GPUImage Source Code Reading (Part 5)

Overview

GPUImage is a well-known open-source image-processing library that lets you apply GPU-accelerated filters and other effects to images, video, and the camera. Compared with the CoreImage framework, GPUImage exposes interfaces that let you build and use your own custom filters. Project address: https://github.com/BradLarson/GPUImage
This article reads through the source code of the GPUImageVideoCamera, GPUImageStillCamera, GPUImageMovieWriter, and GPUImageMovie classes of the GPUImage framework. In GPUImage Source Code Reading (4) the data sources were mainly images and rendered UI; this article covers data sources that come from the camera and from audio/video files. As before, GPUImageView (introduced last time) is used to display the picture, and GPUImageMovieWriter is used here to save the recorded audio/video to a file. The classes covered are:
GPUImageVideoCamera
GPUImageStillCamera
GPUImageMovieWriter
GPUImageMovie

Results

  • Recording video
[GIF: recording video]
  • Taking a photo
[Image: photo capture]
  • Video transcoding with a filter
[GIF: original video]
[GIF: the video after filtering]

GPUImageVideoCamera

GPUImageVideoCamera inherits from GPUImageOutput and implements the AVCaptureVideoDataOutputSampleBufferDelegate and AVCaptureAudioDataOutputSampleBufferDelegate protocols. It drives the camera to capture video; every captured frame becomes a framebuffer object that can be displayed with GPUImageView or written to a video file with GPUImageMovieWriter. It also exposes a GPUImageVideoCameraDelegate so that we can handle the raw CMSampleBuffer ourselves (a small delegate sketch follows the pixel-format note below). Processing video involves the following pixel formats:

kCVPixelFormatType_32BGRA
kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
kCVPixelFormatType_420YpCbCr8BiPlanarFullRange

These pixel formats were covered in detail in the earlier article OpenGL ES入門11-相機視頻渲染 (OpenGL ES Primer 11 – Rendering Camera Video), so they are not repeated here; refer to that article if needed.
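As a quick illustration of the delegate mentioned above, here is a minimal sketch (the class name is illustrative). The protocol declares an optional willOutputSampleBuffer: callback that hands you every captured CMSampleBuffer before GPUImage processes it; assign the handler with videoCamera.delegate = handler;.

@interface MyCameraHandler : NSObject <GPUImageVideoCameraDelegate>
@end

@implementation MyCameraHandler
// Optional callback: invoked with each captured video sample buffer
- (void)willOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef frame = CMSampleBufferGetImageBuffer(sampleBuffer);
    NSLog(@"Captured frame: %zu x %zu", CVPixelBufferGetWidth(frame), CVPixelBufferGetHeight(frame));
}
@end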

  • Property list. Most of the properties are camera-related parameters.
// Whether the AVCaptureSession is running
@property(readonly, nonatomic) BOOL isRunning;

// The AVCaptureSession object
@property(readonly, retain, nonatomic) AVCaptureSession *captureSession;

// Controls the quality/size of the video output, e.g. AVCaptureSessionPreset640x480
@property (readwrite, nonatomic, copy) NSString *captureSessionPreset;

// Video frame rate
@property (readwrite) int32_t frameRate;

// Which camera is currently in use
@property (readonly, getter = isFrontFacingCameraPresent) BOOL frontFacingCameraPresent;
@property (readonly, getter = isBackFacingCameraPresent) BOOL backFacingCameraPresent;

// Real-time benchmark logging
@property(readwrite, nonatomic) BOOL runBenchmark;

// The camera device currently in use, handy for setting its parameters
@property(readonly) AVCaptureDevice *inputCamera;

// Orientation of the output image
@property(readwrite, nonatomic) UIInterfaceOrientation outputImageOrientation;

// Horizontally mirror the front / rear camera
@property(readwrite, nonatomic) BOOL horizontallyMirrorFrontFacingCamera, horizontallyMirrorRearFacingCamera;

// The GPUImageVideoCameraDelegate delegate
@property(nonatomic, assign) id<GPUImageVideoCameraDelegate> delegate;
  • Initializer.
- (id)initWithSessionPreset:(NSString *)sessionPreset cameraPosition:(AVCaptureDevicePosition)cameraPosition;

GPUImageVideoCamera has few initializers; you pass the capture-session preset (video quality) and which camera to use. Calling - (instancetype)init directly initializes with AVCaptureSessionPreset640x480 and AVCaptureDevicePositionBack.
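A minimal sketch of both paths (the 720p preset and front camera below are just example arguments):

// Explicit preset and camera position
GPUImageVideoCamera *camera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset1280x720 cameraPosition:AVCaptureDevicePositionFront];

// -init is equivalent to:
GPUImageVideoCamera *defaultCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];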

- (id)initWithSessionPreset:(NSString *)sessionPreset cameraPosition:(AVCaptureDevicePosition)cameraPosition; 
{
    if (!(self = [super init]))
    {
        return nil;
    }
    
    // Create the video and audio processing queues
    cameraProcessingQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH,0);
    audioProcessingQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_LOW,0);
    
    // Create the frame-rendering semaphore
    frameRenderingSemaphore = dispatch_semaphore_create(1);

    // Initialize instance variables
    _frameRate = 0; // This will not set frame rate unless this value gets set to 1 or above
    _runBenchmark = NO;
    capturePaused = NO;
    outputRotation = kGPUImageNoRotation;
    internalRotation = kGPUImageNoRotation;
    captureAsYUV = YES;
    _preferredConversion = kColorConversion709;
    
    // Pick the front or back camera according to the requested position
    _inputCamera = nil;
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) 
    {
        if ([device position] == cameraPosition)
        {
            _inputCamera = device;
        }
    }
    
    // Bail out immediately if no matching camera was found
    if (!_inputCamera) {
        return nil;
    }
    
    // Create the capture session
    _captureSession = [[AVCaptureSession alloc] init];
    
    // Begin configuration
    [_captureSession beginConfiguration];
    
    // Create the video input
    NSError *error = nil;
    videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:_inputCamera error:&error];
    if ([_captureSession canAddInput:videoInput]) 
    {
        [_captureSession addInput:videoInput];
    }
    
    // Create the video output
    videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    [videoOutput setAlwaysDiscardsLateVideoFrames:NO];
    
//    if (captureAsYUV && [GPUImageContext deviceSupportsRedTextures])
    // 設(shè)置YUV的處理方式
    if (captureAsYUV && [GPUImageContext supportsFastTextureUpload])
    {
        BOOL supportsFullYUVRange = NO;
        NSArray *supportedPixelFormats = videoOutput.availableVideoCVPixelFormatTypes;
        for (NSNumber *currentPixelFormat in supportedPixelFormats)
        {
            if ([currentPixelFormat intValue] == kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)
            {
                supportsFullYUVRange = YES;
            }
        }
        
        if (supportsFullYUVRange)
        {
            // 設(shè)置kCVPixelFormatType_420YpCbCr8BiPlanarFullRange格式
            [videoOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
            isFullYUVRange = YES;
        }
        else
        {
            // 設(shè)置kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange格式
            [videoOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
            isFullYUVRange = NO;
        }
    }
    else
    {
        // 設(shè)置kCVPixelFormatType_32BGRA格式
        [videoOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
    }
    
    // Create the GL program and look up attribute/uniform locations
    runSynchronouslyOnVideoProcessingQueue(^{
        
        if (captureAsYUV)
        {
            [GPUImageContext useImageProcessingContext];
            //            if ([GPUImageContext deviceSupportsRedTextures])
            //            {
            //                yuvConversionProgram = [[GPUImageContext sharedImageProcessingContext] programForVertexShaderString:kGPUImageVertexShaderString fragmentShaderString:kGPUImageYUVVideoRangeConversionForRGFragmentShaderString];
            //            }
            //            else
            //            {
            if (isFullYUVRange)
            {
                yuvConversionProgram = [[GPUImageContext sharedImageProcessingContext] programForVertexShaderString:kGPUImageVertexShaderString fragmentShaderString:kGPUImageYUVFullRangeConversionForLAFragmentShaderString];
            }
            else
            {
                yuvConversionProgram = [[GPUImageContext sharedImageProcessingContext] programForVertexShaderString:kGPUImageVertexShaderString fragmentShaderString:kGPUImageYUVVideoRangeConversionForLAFragmentShaderString];
            }

            //            }
            
            if (!yuvConversionProgram.initialized)
            {
                [yuvConversionProgram addAttribute:@"position"];
                [yuvConversionProgram addAttribute:@"inputTextureCoordinate"];
                
                if (![yuvConversionProgram link])
                {
                    NSString *progLog = [yuvConversionProgram programLog];
                    NSLog(@"Program link log: %@", progLog);
                    NSString *fragLog = [yuvConversionProgram fragmentShaderLog];
                    NSLog(@"Fragment shader compile log: %@", fragLog);
                    NSString *vertLog = [yuvConversionProgram vertexShaderLog];
                    NSLog(@"Vertex shader compile log: %@", vertLog);
                    yuvConversionProgram = nil;
                    NSAssert(NO, @"Filter shader link failed");
                }
            }
            
            yuvConversionPositionAttribute = [yuvConversionProgram attributeIndex:@"position"];
            yuvConversionTextureCoordinateAttribute = [yuvConversionProgram attributeIndex:@"inputTextureCoordinate"];
            yuvConversionLuminanceTextureUniform = [yuvConversionProgram uniformIndex:@"luminanceTexture"];
            yuvConversionChrominanceTextureUniform = [yuvConversionProgram uniformIndex:@"chrominanceTexture"];
            yuvConversionMatrixUniform = [yuvConversionProgram uniformIndex:@"colorConversionMatrix"];
            
            [GPUImageContext setActiveShaderProgram:yuvConversionProgram];
            
            glEnableVertexAttribArray(yuvConversionPositionAttribute);
            glEnableVertexAttribArray(yuvConversionTextureCoordinateAttribute);
        }
    });
    
    // 設(shè)置AVCaptureVideoDataOutputSampleBufferDelegate代理
    [videoOutput setSampleBufferDelegate:self queue:cameraProcessingQueue];
    // Add the video output
    if ([_captureSession canAddOutput:videoOutput])
    {
        [_captureSession addOutput:videoOutput];
    }
    else
    {
        NSLog(@"Couldn't add video output");
        return nil;
    }
    
    // 設(shè)置視頻質(zhì)量
    _captureSessionPreset = sessionPreset;
    [_captureSession setSessionPreset:_captureSessionPreset];

// This will let you get 60 FPS video from the 720p preset on an iPhone 4S, but only that device and that preset
//    AVCaptureConnection *conn = [videoOutput connectionWithMediaType:AVMediaTypeVideo];
//    
//    if (conn.supportsVideoMinFrameDuration)
//        conn.videoMinFrameDuration = CMTimeMake(1,60);
//    if (conn.supportsVideoMaxFrameDuration)
//        conn.videoMaxFrameDuration = CMTimeMake(1,60);
    
    // Commit the configuration
    [_captureSession commitConfiguration];
    
    return self;
}
  • Other methods. GPUImageVideoCamera's methods fall roughly into these groups: 1. adding and removing inputs/outputs; 2. capture control (start, stop, pause, resume); 3. processing audio and video; 4. camera parameter access.
// Add / remove the audio input and output
- (BOOL)addAudioInputsAndOutputs;
- (BOOL)removeAudioInputsAndOutputs;

// Remove all inputs and outputs
- (void)removeInputsAndOutputs;

// Start, stop, pause, and resume camera capture
- (void)startCameraCapture;
- (void)stopCameraCapture;
- (void)pauseCameraCapture;
- (void)resumeCameraCapture;

// Process video / audio sample buffers
- (void)processVideoSampleBuffer:(CMSampleBufferRef)sampleBuffer;
- (void)processAudioSampleBuffer:(CMSampleBufferRef)sampleBuffer;

// Query camera-related parameters
- (AVCaptureDevicePosition)cameraPosition;
- (AVCaptureConnection *)videoCaptureConnection;
+ (BOOL)isBackFacingCameraPresent;
+ (BOOL)isFrontFacingCameraPresent;

// Switch between the front and back cameras
- (void)rotateCamera;

// Average frame duration during capture
- (CGFloat)averageFrameDurationDuringCapture;

// Reset the benchmark counters
- (void)resetBenchmarkAverage;

Although GPUImageVideoCamera has quite a few methods, their internal logic is not complicated.
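Before the implementations, a minimal sketch of how some of these are typically used on a configured camera (camera is assumed to be a GPUImageVideoCamera set up as above):

camera.frameRate = 30;                      // leaving this at 0 keeps the device default
if ([GPUImageVideoCamera isFrontFacingCameraPresent]) {
    [camera rotateCamera];                  // switch between the front and back cameras
}
[camera pauseCameraCapture];                // incoming frames are ignored while paused
[camera resumeCameraCapture];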

// Add the audio input and output
- (BOOL)addAudioInputsAndOutputs
{
    if (audioOutput)
        return NO;
    
    [_captureSession beginConfiguration];
    
    _microphone = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    audioInput = [AVCaptureDeviceInput deviceInputWithDevice:_microphone error:nil];
    if ([_captureSession canAddInput:audioInput])
    {
        [_captureSession addInput:audioInput];
    }
    audioOutput = [[AVCaptureAudioDataOutput alloc] init];
    
    if ([_captureSession canAddOutput:audioOutput])
    {
        [_captureSession addOutput:audioOutput];
    }
    else
    {
        NSLog(@"Couldn't add audio output");
    }
    [audioOutput setSampleBufferDelegate:self queue:audioProcessingQueue];
    
    [_captureSession commitConfiguration];
    return YES;
}

// Remove the audio input and output
- (BOOL)removeAudioInputsAndOutputs
{
    if (!audioOutput)
        return NO;
    
    [_captureSession beginConfiguration];
    [_captureSession removeInput:audioInput];
    [_captureSession removeOutput:audioOutput];
    audioInput = nil;
    audioOutput = nil;
    _microphone = nil;
    [_captureSession commitConfiguration];
    return YES;
}

// Remove all inputs and outputs
- (void)removeInputsAndOutputs;
{
    [_captureSession beginConfiguration];
    if (videoInput) {
        [_captureSession removeInput:videoInput];
        [_captureSession removeOutput:videoOutput];
        videoInput = nil;
        videoOutput = nil;
    }
    if (_microphone != nil)
    {
        [_captureSession removeInput:audioInput];
        [_captureSession removeOutput:audioOutput];
        audioInput = nil;
        audioOutput = nil;
        _microphone = nil;
    }
    [_captureSession commitConfiguration];
}

// Start capturing
- (void)startCameraCapture;
{
    if (![_captureSession isRunning])
    {
        startingCaptureTime = [NSDate date];
        [_captureSession startRunning];
    };
}

// Stop capturing
- (void)stopCameraCapture;
{
    if ([_captureSession isRunning])
    {
        [_captureSession stopRunning];
    }
}

// Pause capturing
- (void)pauseCameraCapture;
{
    capturePaused = YES;
}

// Resume capturing
- (void)resumeCameraCapture;
{
    capturePaused = NO;
}

// Process a video sample buffer
- (void)processVideoSampleBuffer:(CMSampleBufferRef)sampleBuffer;
{
    if (capturePaused)
    {
        return;
    }
    
    CFAbsoluteTime startTime = CFAbsoluteTimeGetCurrent();
    CVImageBufferRef cameraFrame = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Get the frame width and height
    int bufferWidth = (int) CVPixelBufferGetWidth(cameraFrame);
    int bufferHeight = (int) CVPixelBufferGetHeight(cameraFrame);
    CFTypeRef colorAttachments = CVBufferGetAttachment(cameraFrame, kCVImageBufferYCbCrMatrixKey, NULL);
    if (colorAttachments != NULL)
    {
        if(CFStringCompare(colorAttachments, kCVImageBufferYCbCrMatrix_ITU_R_601_4, 0) == kCFCompareEqualTo)
        {
            if (isFullYUVRange)
            {
                _preferredConversion = kColorConversion601FullRange;
            }
            else
            {
                _preferredConversion = kColorConversion601;
            }
        }
        else
        {
            _preferredConversion = kColorConversion709;
        }
    }
    else
    {
        if (isFullYUVRange)
        {
            _preferredConversion = kColorConversion601FullRange;
        }
        else
        {
            _preferredConversion = kColorConversion601;
        }
    }

    CMTime currentTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);

    [GPUImageContext useImageProcessingContext];
    
     // Fast-path YUV texture upload
    if ([GPUImageContext supportsFastTextureUpload] && captureAsYUV)
    {
        CVOpenGLESTextureRef luminanceTextureRef = NULL;
        CVOpenGLESTextureRef chrominanceTextureRef = NULL;

//        if (captureAsYUV && [GPUImageContext deviceSupportsRedTextures])
        if (CVPixelBufferGetPlaneCount(cameraFrame) > 0) // Check for YUV planar inputs to do RGB conversion
        {
            CVPixelBufferLockBaseAddress(cameraFrame, 0);
            
            if ( (imageBufferWidth != bufferWidth) && (imageBufferHeight != bufferHeight) )
            {
                imageBufferWidth = bufferWidth;
                imageBufferHeight = bufferHeight;
            }
            
            CVReturn err;
            // Y (luminance) plane
            glActiveTexture(GL_TEXTURE4);
            if ([GPUImageContext deviceSupportsRedTextures])
            {
//                err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, coreVideoTextureCache, cameraFrame, NULL, GL_TEXTURE_2D, GL_RED_EXT, bufferWidth, bufferHeight, GL_RED_EXT, GL_UNSIGNED_BYTE, 0, &luminanceTextureRef);
                err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, [[GPUImageContext sharedImageProcessingContext] coreVideoTextureCache], cameraFrame, NULL, GL_TEXTURE_2D, GL_LUMINANCE, bufferWidth, bufferHeight, GL_LUMINANCE, GL_UNSIGNED_BYTE, 0, &luminanceTextureRef);
            }
            else
            {
                err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, [[GPUImageContext sharedImageProcessingContext] coreVideoTextureCache], cameraFrame, NULL, GL_TEXTURE_2D, GL_LUMINANCE, bufferWidth, bufferHeight, GL_LUMINANCE, GL_UNSIGNED_BYTE, 0, &luminanceTextureRef);
            }
            if (err)
            {
                NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err);
            }
            
            luminanceTexture = CVOpenGLESTextureGetName(luminanceTextureRef);
            glBindTexture(GL_TEXTURE_2D, luminanceTexture);
            glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
            glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
            
            // UV (chrominance) plane, at half the width and height of the Y plane
            glActiveTexture(GL_TEXTURE5);
            if ([GPUImageContext deviceSupportsRedTextures])
            {
//                err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, coreVideoTextureCache, cameraFrame, NULL, GL_TEXTURE_2D, GL_RG_EXT, bufferWidth/2, bufferHeight/2, GL_RG_EXT, GL_UNSIGNED_BYTE, 1, &chrominanceTextureRef);
                err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, [[GPUImageContext sharedImageProcessingContext] coreVideoTextureCache], cameraFrame, NULL, GL_TEXTURE_2D, GL_LUMINANCE_ALPHA, bufferWidth/2, bufferHeight/2, GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, 1, &chrominanceTextureRef);
            }
            else
            {
                err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, [[GPUImageContext sharedImageProcessingContext] coreVideoTextureCache], cameraFrame, NULL, GL_TEXTURE_2D, GL_LUMINANCE_ALPHA, bufferWidth/2, bufferHeight/2, GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, 1, &chrominanceTextureRef);
            }
            if (err)
            {
                NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err);
            }
            
            chrominanceTexture = CVOpenGLESTextureGetName(chrominanceTextureRef);
            glBindTexture(GL_TEXTURE_2D, chrominanceTexture);
            glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
            glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
            
//            if (!allTargetsWantMonochromeData)
//            {
                [self convertYUVToRGBOutput];
//            }

            int rotatedImageBufferWidth = bufferWidth, rotatedImageBufferHeight = bufferHeight;
            
            if (GPUImageRotationSwapsWidthAndHeight(internalRotation))
            {
                rotatedImageBufferWidth = bufferHeight;
                rotatedImageBufferHeight = bufferWidth;
            }
            
            [self updateTargetsForVideoCameraUsingCacheTextureAtWidth:rotatedImageBufferWidth height:rotatedImageBufferHeight time:currentTime];
            
            CVPixelBufferUnlockBaseAddress(cameraFrame, 0);
            CFRelease(luminanceTextureRef);
            CFRelease(chrominanceTextureRef);
        }
        else
        {
            // TODO: Mesh this with the output framebuffer structure
            
//            CVPixelBufferLockBaseAddress(cameraFrame, 0);
//            
//            CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, [[GPUImageContext sharedImageProcessingContext] coreVideoTextureCache], cameraFrame, NULL, GL_TEXTURE_2D, GL_RGBA, bufferWidth, bufferHeight, GL_BGRA, GL_UNSIGNED_BYTE, 0, &texture);
//            
//            if (!texture || err) {
//                NSLog(@"Camera CVOpenGLESTextureCacheCreateTextureFromImage failed (error: %d)", err);
//                NSAssert(NO, @"Camera failure");
//                return;
//            }
//            
//            outputTexture = CVOpenGLESTextureGetName(texture);
//            //        glBindTexture(CVOpenGLESTextureGetTarget(texture), outputTexture);
//            glBindTexture(GL_TEXTURE_2D, outputTexture);
//            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
//            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
//            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
//            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
//            
//            [self updateTargetsForVideoCameraUsingCacheTextureAtWidth:bufferWidth height:bufferHeight time:currentTime];
//
//            CVPixelBufferUnlockBaseAddress(cameraFrame, 0);
//            CFRelease(texture);
//
//            outputTexture = 0;
        }
        
        // Benchmark frame-time statistics
        if (_runBenchmark)
        {
            numberOfFramesCaptured++;
            if (numberOfFramesCaptured > INITIALFRAMESTOIGNOREFORBENCHMARK)
            {
                CFAbsoluteTime currentFrameTime = (CFAbsoluteTimeGetCurrent() - startTime);
                totalFrameTimeDuringCapture += currentFrameTime;
                NSLog(@"Average frame time : %f ms", [self averageFrameDurationDuringCapture]);
                NSLog(@"Current frame time : %f ms", 1000.0 * currentFrameTime);
            }
        }
    }
    else
    {
        // Lock the pixel buffer's base address
        CVPixelBufferLockBaseAddress(cameraFrame, 0);
        
        // Bytes per row (width * 4 for BGRA)
        int bytesPerRow = (int) CVPixelBufferGetBytesPerRow(cameraFrame);
        // Fetch a framebuffer from the cache
        outputFramebuffer = [[GPUImageContext sharedFramebufferCache] fetchFramebufferForSize:CGSizeMake(bytesPerRow / 4, bufferHeight) onlyTexture:YES];
        [outputFramebuffer activateFramebuffer];
        
        // Bind the framebuffer's texture
        glBindTexture(GL_TEXTURE_2D, [outputFramebuffer texture]);
        
        //        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, bufferWidth, bufferHeight, 0, GL_BGRA, GL_UNSIGNED_BYTE, CVPixelBufferGetBaseAddress(cameraFrame));
        
        // Using BGRA extension to pull in video frame data directly
        // The use of bytesPerRow / 4 accounts for a display glitch present in preview video frames when using the photo preset on the camera

        // Upload the BGRA data (swizzled to RGBA)
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, bytesPerRow / 4, bufferHeight, 0, GL_BGRA, GL_UNSIGNED_BYTE, CVPixelBufferGetBaseAddress(cameraFrame));
        
        [self updateTargetsForVideoCameraUsingCacheTextureAtWidth:bytesPerRow / 4 height:bufferHeight time:currentTime];
        
        CVPixelBufferUnlockBaseAddress(cameraFrame, 0);
        
        // Update benchmark statistics
        if (_runBenchmark)
        {
            numberOfFramesCaptured++;
            if (numberOfFramesCaptured > INITIALFRAMESTOIGNOREFORBENCHMARK)
            {
                CFAbsoluteTime currentFrameTime = (CFAbsoluteTimeGetCurrent() - startTime);
                totalFrameTimeDuringCapture += currentFrameTime;
            }
        }
    }  
}

// Process an audio sample buffer
- (void)processAudioSampleBuffer:(CMSampleBufferRef)sampleBuffer;
{
    [self.audioEncodingTarget processAudioBuffer:sampleBuffer]; 
}

  • Note
    Since processAudioSampleBuffer simply hands the audio buffer to audioEncodingTarget, you must set audioEncodingTarget if the recorded video is supposed to contain sound; otherwise the recorded video will be silent.

  • Texture formats and how texels map to colors

Base format | Texel value when sampled
:----:|:---:
GL_RED | (R, 0.0, 0.0, 1.0)
GL_RG | (R, G, 0.0, 1.0)
GL_RGB | (R, G, B, 1.0)
GL_RGBA | (R, G, B, A)
GL_LUMINANCE | (L, L, L, 1.0)
GL_LUMINANCE_ALPHA | (L, L, L, A)
GL_ALPHA | (0.0, 0.0, 0.0, A)
The table above makes it clear why GPUImage uploads the Y plane with the GL_LUMINANCE internal format and the UV plane with GL_LUMINANCE_ALPHA: the conversion shader can then read Y from one texture and Cb/Cr from the other.
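For reference, the YUV conversion shader samples Y from the luminance texture and (Cb, Cr) from the chrominance texture, then multiplies by the _preferredConversion matrix chosen in processVideoSampleBuffer. For full-range BT.601 data the conversion is roughly the following (textbook coefficients, shown for orientation rather than copied from GPUImage's kColorConversion601FullRange constants, which are rounded slightly differently):

\begin{pmatrix} R \\ G \\ B \end{pmatrix} =
\begin{pmatrix} 1 & 0 & 1.402 \\ 1 & -0.344 & -0.714 \\ 1 & 1.772 & 0 \end{pmatrix}
\begin{pmatrix} Y \\ C_b - 0.5 \\ C_r - 0.5 \end{pmatrix}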

GPUImageStillCamera

GPUImageStillCamera is mainly for taking photos. It inherits from GPUImageVideoCamera, so besides everything GPUImageVideoCamera can do, it adds a rich set of photo-capture APIs that make still-capture operations convenient.

  • Property list. GPUImageStillCamera has only a few properties, and they are image-related.
// JPEG compression quality, 0.8 by default
@property CGFloat jpegCompressionQuality;
// Metadata of the captured image
@property (readonly) NSDictionary *currentCaptureMetadata;
  • Method list. The methods are all about taking photos, with a variety of output types: CMSampleBuffer, UIImage, NSData, and so on. If a filter chain is in use, the final filter in the chain is passed in as well.
- (void)capturePhotoAsSampleBufferWithCompletionHandler:(void (^)(CMSampleBufferRef imageSampleBuffer, NSError *error))block;
- (void)capturePhotoAsImageProcessedUpToFilter:(GPUImageOutput<GPUImageInput> *)finalFilterInChain withCompletionHandler:(void (^)(UIImage *processedImage, NSError *error))block;
- (void)capturePhotoAsImageProcessedUpToFilter:(GPUImageOutput<GPUImageInput> *)finalFilterInChain withOrientation:(UIImageOrientation)orientation withCompletionHandler:(void (^)(UIImage *processedImage, NSError *error))block;
- (void)capturePhotoAsJPEGProcessedUpToFilter:(GPUImageOutput<GPUImageInput> *)finalFilterInChain withCompletionHandler:(void (^)(NSData *processedJPEG, NSError *error))block;
- (void)capturePhotoAsJPEGProcessedUpToFilter:(GPUImageOutput<GPUImageInput> *)finalFilterInChain withOrientation:(UIImageOrientation)orientation withCompletionHandler:(void (^)(NSData *processedJPEG, NSError *error))block;
- (void)capturePhotoAsPNGProcessedUpToFilter:(GPUImageOutput<GPUImageInput> *)finalFilterInChain withCompletionHandler:(void (^)(NSData *processedPNG, NSError *error))block;
- (void)capturePhotoAsPNGProcessedUpToFilter:(GPUImageOutput<GPUImageInput> *)finalFilterInChain withOrientation:(UIImageOrientation)orientation withCompletionHandler:(void (^)(NSData *processedPNG, NSError *error))block;

Although the API is fairly rich, every variant ends up calling the private method - (void)capturePhotoProcessedUpToFilter:(GPUImageOutput<GPUImageInput> *)finalFilterInChain withImageOnGPUHandler:(void (^)(NSError *error))block, so that is the method to focus on.
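Before the implementation, a minimal usage sketch (stillCamera, filter, and outputPath are assumed to exist; this mirrors the photo example at the end of this article but produces JPEG data instead of a UIImage):

[stillCamera capturePhotoAsJPEGProcessedUpToFilter:filter withCompletionHandler:^(NSData *processedJPEG, NSError *error) {
    if (!error) {
        [processedJPEG writeToFile:outputPath atomically:YES];
    } else {
        NSLog(@"Capture failed: %@", error);
    }
}];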

- (void)capturePhotoAsJPEGProcessedUpToFilter:(GPUImageOutput<GPUImageInput> *)finalFilterInChain withOrientation:(UIImageOrientation)orientation withCompletionHandler:(void (^)(NSData *processedImage, NSError *error))block {
    // Call the private method to render into a framebuffer
    [self capturePhotoProcessedUpToFilter:finalFilterInChain withImageOnGPUHandler:^(NSError *error) {
        NSData *dataForJPEGFile = nil;
        
        if(!error) {
            @autoreleasepool {
                // Read the framebuffer back into a UIImage
                UIImage *filteredPhoto = [finalFilterInChain imageFromCurrentFramebufferWithOrientation:orientation];
                dispatch_semaphore_signal(frameRenderingSemaphore);
                
                // Encode the UIImage as JPEG data
                dataForJPEGFile = UIImageJPEGRepresentation(filteredPhoto, self.jpegCompressionQuality);
            }
        } else {
            dispatch_semaphore_signal(frameRenderingSemaphore);
        }
        
        block(dataForJPEGFile, error);
    }];
}

- (void)capturePhotoProcessedUpToFilter:(GPUImageOutput<GPUImageInput> *)finalFilterInChain withImageOnGPUHandler:(void (^)(NSError *error))block
{
    // Wait on the frame-rendering semaphore
    dispatch_semaphore_wait(frameRenderingSemaphore, DISPATCH_TIME_FOREVER);
    
    // Bail out if a still capture is already in progress
    if(photoOutput.isCapturingStillImage){
        block([NSError errorWithDomain:AVFoundationErrorDomain code:AVErrorMaximumStillImageCaptureRequestsExceeded userInfo:nil]);
        return;
    }
    
    // Capture a still image asynchronously
    [photoOutput captureStillImageAsynchronouslyFromConnection:[[photoOutput connections] objectAtIndex:0] completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        if(imageSampleBuffer == NULL){
            block(error);
            return;
        }

        // For now, resize photos to fix within the max texture size of the GPU
        CVImageBufferRef cameraFrame = CMSampleBufferGetImageBuffer(imageSampleBuffer);
        
        // Get the image size
        CGSize sizeOfPhoto = CGSizeMake(CVPixelBufferGetWidth(cameraFrame), CVPixelBufferGetHeight(cameraFrame));
        CGSize scaledImageSizeToFitOnGPU = [GPUImageContext sizeThatFitsWithinATextureForSize:sizeOfPhoto];
        // Check whether the image needs to be resized to fit the GPU's maximum texture size
        if (!CGSizeEqualToSize(sizeOfPhoto, scaledImageSizeToFitOnGPU))
        {
            CMSampleBufferRef sampleBuffer = NULL;
            
            if (CVPixelBufferGetPlaneCount(cameraFrame) > 0)
            {
                NSAssert(NO, @"Error: no downsampling for YUV input in the framework yet");
            }
            else
            {
                // Resize the sample buffer
                GPUImageCreateResizedSampleBuffer(cameraFrame, scaledImageSizeToFitOnGPU, &sampleBuffer);
            }

            dispatch_semaphore_signal(frameRenderingSemaphore);
            [finalFilterInChain useNextFrameForImageCapture];
            // Hand off to the superclass to process the frame into a framebuffer
            [self captureOutput:photoOutput didOutputSampleBuffer:sampleBuffer fromConnection:[[photoOutput connections] objectAtIndex:0]];
            dispatch_semaphore_wait(frameRenderingSemaphore, DISPATCH_TIME_FOREVER);
            if (sampleBuffer != NULL)
                CFRelease(sampleBuffer);
        }
        else
        {
            // This is a workaround for the corrupt images that are sometimes returned when taking a photo with the front camera and using the iOS 5.0 texture caches
            AVCaptureDevicePosition currentCameraPosition = [[videoInput device] position];
            if ( (currentCameraPosition != AVCaptureDevicePositionFront) || (![GPUImageContext supportsFastTextureUpload]) || !requiresFrontCameraTextureCacheCorruptionWorkaround)
            {
                dispatch_semaphore_signal(frameRenderingSemaphore);
                [finalFilterInChain useNextFrameForImageCapture];
                // Hand off to the superclass to process the frame into a framebuffer
                [self captureOutput:photoOutput didOutputSampleBuffer:imageSampleBuffer fromConnection:[[photoOutput connections] objectAtIndex:0]];
                dispatch_semaphore_wait(frameRenderingSemaphore, DISPATCH_TIME_FOREVER);
            }
        }
        
        // Grab the image's metadata
        CFDictionaryRef metadata = CMCopyDictionaryOfAttachments(NULL, imageSampleBuffer, kCMAttachmentMode_ShouldPropagate);
        _currentCaptureMetadata = (__bridge_transfer NSDictionary *)metadata;

        block(nil);

        _currentCaptureMetadata = nil;
    }];
}

GPUImageMovieWriter

GPUImageMovieWriter's main job is to encode audio and video and save them to a file. It implements the GPUImageInput protocol, so it can accept framebuffer input. When recording, it mainly relies on AVAssetWriter, AVAssetWriterInput, and AVAssetWriterInputPixelBufferAdaptor. AVAssetWriter supports quite a few container formats, listed in the table below:

Constant | File extension(s)
:----:|:---:
AVFileTypeQuickTimeMovie | .mov or .qt
AVFileTypeMPEG4 | .mp4
AVFileTypeAppleM4V | .m4v
AVFileTypeAppleM4A | .m4a
AVFileType3GPP | .3gp, .3gpp, or .sdv
AVFileType3GPP2 | .3g2 or .3gp2
AVFileTypeCoreAudioFormat | .caf
AVFileTypeWAVE | .wav, .wave, or .bwf
AVFileTypeAIFF | .aif or .aiff
AVFileTypeAIFC | .aifc or .cdda
AVFileTypeAMR | .amr
AVFileTypeMPEGLayer3 | .mp3
AVFileTypeSunAU | .au or .snd
AVFileTypeAC3 | .ac3
AVFileTypeEnhancedAC3 | .eac3
  • Properties. GPUImageMovieWriter has quite a few properties, and they are practical: most relate to the state of the recording, such as whether audio/video is being saved and the completion/failure callbacks. Below are some of the more important ones.
// Whether the movie has an audio track
@property(readwrite, nonatomic) BOOL hasAudioTrack;
// Pass the audio through without re-encoding
@property(readwrite, nonatomic) BOOL shouldPassthroughAudio;
// 標(biāo)記不被再次使用
@property(readwrite, nonatomic) BOOL shouldInvalidateAudioSampleWhenDone;
// Completion and failure callbacks
@property(nonatomic, copy) void(^completionBlock)(void);
@property(nonatomic, copy) void(^failureBlock)(NSError*);
// Whether video is encoded in real time (live)
@property(readwrite, nonatomic) BOOL encodingLiveVideo;
// Callbacks fired when the video / audio inputs are ready for more data
@property(nonatomic, copy) BOOL(^videoInputReadyCallback)(void);
@property(nonatomic, copy) BOOL(^audioInputReadyCallback)(void);
// Callback for processing raw audio samples before they are written
@property(nonatomic, copy) void(^audioProcessingCallback)(SInt16 **samplesRef, CMItemCount numSamplesInBuffer);
// The underlying AVAssetWriter
@property(nonatomic, readonly) AVAssetWriter *assetWriter;
// Duration from the start of recording to the previous frame
@property(nonatomic, readonly) CMTime duration;
  • Initializers.
- (id)initWithMovieURL:(NSURL *)newMovieURL size:(CGSize)newSize;
- (id)initWithMovieURL:(NSURL *)newMovieURL size:(CGSize)newSize fileType:(NSString *)newFileType outputSettings:(NSDictionary *)outputSettings;

Initialization mainly involves: 1. initializing instance variables; 2. creating the OpenGL program; 3. setting up AVAssetWriter parameters such as the video codec and size. Note that no audio parameters are configured here; if you need audio, configure it first with - (void)setHasAudioTrack:(BOOL)newValue (or the variant that takes explicit audio settings).
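A minimal sketch of creating a writer (outputURL and the size are placeholders):

GPUImageMovieWriter *writer = [[GPUImageMovieWriter alloc] initWithMovieURL:outputURL size:CGSizeMake(480.0, 640.0) fileType:AVFileTypeQuickTimeMovie outputSettings:nil];
[writer setHasAudioTrack:YES audioSettings:nil];   // required if the movie should contain sound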

- (id)initWithMovieURL:(NSURL *)newMovieURL size:(CGSize)newSize;
{
    // Forward to the designated initializer
    return [self initWithMovieURL:newMovieURL size:newSize fileType:AVFileTypeQuickTimeMovie outputSettings:nil];
}

- (id)initWithMovieURL:(NSURL *)newMovieURL size:(CGSize)newSize fileType:(NSString *)newFileType outputSettings:(NSMutableDictionary *)outputSettings;
{
    if (!(self = [super init]))
    {
        return nil;
    }
    
    // Initialize instance variables
    _shouldInvalidateAudioSampleWhenDone = NO;
    
    self.enabled = YES;
    alreadyFinishedRecording = NO;
    videoEncodingIsFinished = NO;
    audioEncodingIsFinished = NO;

    discont = NO;
    videoSize = newSize;
    movieURL = newMovieURL;
    fileType = newFileType;
    startTime = kCMTimeInvalid;
    _encodingLiveVideo = [[outputSettings objectForKey:@"EncodingLiveVideo"] isKindOfClass:[NSNumber class]] ? [[outputSettings objectForKey:@"EncodingLiveVideo"] boolValue] : YES;
    previousFrameTime = kCMTimeNegativeInfinity;
    previousAudioTime = kCMTimeNegativeInfinity;
    inputRotation = kGPUImageNoRotation;
    
    // Set up the movie writer's own GL context
    _movieWriterContext = [[GPUImageContext alloc] init];
    [_movieWriterContext useSharegroup:[[[GPUImageContext sharedImageProcessingContext] context] sharegroup]];

    runSynchronouslyOnContextQueue(_movieWriterContext, ^{
        [_movieWriterContext useAsCurrentContext];
        
        // Create the OpenGL program
        if ([GPUImageContext supportsFastTextureUpload])
        {
            colorSwizzlingProgram = [_movieWriterContext programForVertexShaderString:kGPUImageVertexShaderString fragmentShaderString:kGPUImagePassthroughFragmentShaderString];
        }
        else
        {
            colorSwizzlingProgram = [_movieWriterContext programForVertexShaderString:kGPUImageVertexShaderString fragmentShaderString:kGPUImageColorSwizzlingFragmentShaderString];
        }
        
        // Link the program and look up its GLSL attributes / uniforms
        if (!colorSwizzlingProgram.initialized)
        {
            [colorSwizzlingProgram addAttribute:@"position"];
            [colorSwizzlingProgram addAttribute:@"inputTextureCoordinate"];
            
            if (![colorSwizzlingProgram link])
            {
                NSString *progLog = [colorSwizzlingProgram programLog];
                NSLog(@"Program link log: %@", progLog);
                NSString *fragLog = [colorSwizzlingProgram fragmentShaderLog];
                NSLog(@"Fragment shader compile log: %@", fragLog);
                NSString *vertLog = [colorSwizzlingProgram vertexShaderLog];
                NSLog(@"Vertex shader compile log: %@", vertLog);
                colorSwizzlingProgram = nil;
                NSAssert(NO, @"Filter shader link failed");
            }
        }        
        
        colorSwizzlingPositionAttribute = [colorSwizzlingProgram attributeIndex:@"position"];
        colorSwizzlingTextureCoordinateAttribute = [colorSwizzlingProgram attributeIndex:@"inputTextureCoordinate"];
        colorSwizzlingInputTextureUniform = [colorSwizzlingProgram uniformIndex:@"inputImageTexture"];
        
        [_movieWriterContext setContextShaderProgram:colorSwizzlingProgram];
        
        glEnableVertexAttribArray(colorSwizzlingPositionAttribute);
        glEnableVertexAttribArray(colorSwizzlingTextureCoordinateAttribute);
    });
        
    [self initializeMovieWithOutputSettings:outputSettings];

    return self;
}

- (void)initializeMovieWithOutputSettings:(NSDictionary *)outputSettings;
{
    isRecording = NO;
    
    self.enabled = YES;
    NSError *error = nil;
    // Create the AVAssetWriter with the output URL and file type
    assetWriter = [[AVAssetWriter alloc] initWithURL:movieURL fileType:fileType error:&error];
    
    // Report initialization failure via the failure block or the delegate
    if (error != nil)
    {
        NSLog(@"Error: %@", error);
        if (failureBlock) 
        {
            failureBlock(error);
        }
        else 
        {
            if(self.delegate && [self.delegate respondsToSelector:@selector(movieRecordingFailedWithError:)])
            {
                [self.delegate movieRecordingFailedWithError:error];
            }
        }
    }
    
    // Set this to make sure that a functional movie is produced, even if the recording is cut off mid-stream. Only the last second should be lost in that case.
    assetWriter.movieFragmentInterval = CMTimeMakeWithSeconds(1.0, 1000);
    
    // 設(shè)置視頻的寬高椭坚,以及編碼格式
    if (outputSettings == nil) 
    {
        NSMutableDictionary *settings = [[NSMutableDictionary alloc] init];
        [settings setObject:AVVideoCodecH264 forKey:AVVideoCodecKey];
        [settings setObject:[NSNumber numberWithInt:videoSize.width] forKey:AVVideoWidthKey];
        [settings setObject:[NSNumber numberWithInt:videoSize.height] forKey:AVVideoHeightKey];
        outputSettings = settings;
    }
    // If settings were supplied by the caller, make sure the required keys are present
    else 
    {
        __unused NSString *videoCodec = [outputSettings objectForKey:AVVideoCodecKey];
        __unused NSNumber *width = [outputSettings objectForKey:AVVideoWidthKey];
        __unused NSNumber *height = [outputSettings objectForKey:AVVideoHeightKey];
        
        NSAssert(videoCodec && width && height, @"OutputSettings is missing required parameters.");
        
        if( [outputSettings objectForKey:@"EncodingLiveVideo"] ) {
            NSMutableDictionary *tmp = [outputSettings mutableCopy];
            [tmp removeObjectForKey:@"EncodingLiveVideo"];
            outputSettings = tmp;
        }
    }
    
    /*
    NSDictionary *videoCleanApertureSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                                [NSNumber numberWithInt:videoSize.width], AVVideoCleanApertureWidthKey,
                                                [NSNumber numberWithInt:videoSize.height], AVVideoCleanApertureHeightKey,
                                                [NSNumber numberWithInt:0], AVVideoCleanApertureHorizontalOffsetKey,
                                                [NSNumber numberWithInt:0], AVVideoCleanApertureVerticalOffsetKey,
                                                nil];

    NSDictionary *videoAspectRatioSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                              [NSNumber numberWithInt:3], AVVideoPixelAspectRatioHorizontalSpacingKey,
                                              [NSNumber numberWithInt:3], AVVideoPixelAspectRatioVerticalSpacingKey,
                                              nil];

    NSMutableDictionary * compressionProperties = [[NSMutableDictionary alloc] init];
    [compressionProperties setObject:videoCleanApertureSettings forKey:AVVideoCleanApertureKey];
    [compressionProperties setObject:videoAspectRatioSettings forKey:AVVideoPixelAspectRatioKey];
    [compressionProperties setObject:[NSNumber numberWithInt: 2000000] forKey:AVVideoAverageBitRateKey];
    [compressionProperties setObject:[NSNumber numberWithInt: 16] forKey:AVVideoMaxKeyFrameIntervalKey];
    [compressionProperties setObject:AVVideoProfileLevelH264Main31 forKey:AVVideoProfileLevelKey];
    
    [outputSettings setObject:compressionProperties forKey:AVVideoCompressionPropertiesKey];
    */
    
    // Create the video AVAssetWriterInput
    assetWriterVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:outputSettings];
    // Expect real-time data; without this, frames may be dropped
    assetWriterVideoInput.expectsMediaDataInRealTime = _encodingLiveVideo;
    
    // You need to use BGRA for the video in order to get realtime encoding. I use a color-swizzling shader to line up glReadPixels' normal RGBA output with the movie input's BGRA.
    // 設(shè)置輸入到編碼器的像素格式
    NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey,
                                                           [NSNumber numberWithInt:videoSize.width], kCVPixelBufferWidthKey,
                                                           [NSNumber numberWithInt:videoSize.height], kCVPixelBufferHeightKey,
                                                           nil];
//    NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey,
//                                                           nil];
    // Create the AVAssetWriterInputPixelBufferAdaptor
    assetWriterPixelBufferInput = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:assetWriterVideoInput sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];
    
    [assetWriter addInput:assetWriterVideoInput];
}
  • Other methods.
// 設(shè)置需要寫入音頻數(shù)據(jù)
- (void)setHasAudioTrack:(BOOL)hasAudioTrack audioSettings:(NSDictionary *)audioOutputSettings;

// Start, finish, and cancel recording
- (void)startRecording;
- (void)startRecordingInOrientation:(CGAffineTransform)orientationTransform;
- (void)finishRecording;
- (void)finishRecordingWithCompletionHandler:(void (^)(void))handler;
- (void)cancelRecording;

// Process audio buffers
- (void)processAudioBuffer:(CMSampleBufferRef)audioBuffer;

// Drive the videoInputReadyCallback / audioInputReadyCallback pull-style callbacks
- (void)enableSynchronizationCallbacks;

GPUImageMovieWriter does not have many methods, but each one is fairly long and its internals are relatively involved. Only the common ones are listed here; if you need to record video, it is worth reading the GPUImageMovieWriter source carefully.
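A minimal sketch of the recording lifecycle built from these methods (writer is assumed to already be a target of a camera or filter, as in the examples at the end of this article):

[writer startRecording];
// ... frames arrive and are encoded while recording ...
[writer finishRecordingWithCompletionHandler:^{
    NSLog(@"Finished writing the movie file");
}];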

// Set up the audio parameters: format, channel count, sample rate, bit rate
- (void)setHasAudioTrack:(BOOL)newValue audioSettings:(NSDictionary *)audioOutputSettings;
{
    _hasAudioTrack = newValue;
    
    if (_hasAudioTrack)
    {
        if (_shouldPassthroughAudio)
        {
            // Do not set any settings so audio will be the same as passthrough
            audioOutputSettings = nil;
        }
        else if (audioOutputSettings == nil)
        {
            AVAudioSession *sharedAudioSession = [AVAudioSession sharedInstance];
            double preferredHardwareSampleRate;
            
            if ([sharedAudioSession respondsToSelector:@selector(sampleRate)])
            {
                preferredHardwareSampleRate = [sharedAudioSession sampleRate];
            }
            else
            {
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wdeprecated-declarations"
                preferredHardwareSampleRate = [[AVAudioSession sharedInstance] currentHardwareSampleRate];
#pragma clang diagnostic pop
            }
            
            AudioChannelLayout acl;
            bzero( &acl, sizeof(acl));
            acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;
            
            audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                         [ NSNumber numberWithInt: kAudioFormatMPEG4AAC], AVFormatIDKey,
                                         [ NSNumber numberWithInt: 1 ], AVNumberOfChannelsKey,
                                         [ NSNumber numberWithFloat: preferredHardwareSampleRate ], AVSampleRateKey,
                                         [ NSData dataWithBytes: &acl length: sizeof( acl ) ], AVChannelLayoutKey,
                                         //[ NSNumber numberWithInt:AVAudioQualityLow], AVEncoderAudioQualityKey,
                                         [ NSNumber numberWithInt: 64000 ], AVEncoderBitRateKey,
                                         nil];
/*
            AudioChannelLayout acl;
            bzero( &acl, sizeof(acl));
            acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;
            
            audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   [ NSNumber numberWithInt: kAudioFormatMPEG4AAC ], AVFormatIDKey,
                                   [ NSNumber numberWithInt: 1 ], AVNumberOfChannelsKey,
                                   [ NSNumber numberWithFloat: 44100.0 ], AVSampleRateKey,
                                   [ NSNumber numberWithInt: 64000 ], AVEncoderBitRateKey,
                                   [ NSData dataWithBytes: &acl length: sizeof( acl ) ], AVChannelLayoutKey,
                                   nil];*/
        }
        
        assetWriterAudioInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:audioOutputSettings];
        [assetWriter addInput:assetWriterAudioInput];
        assetWriterAudioInput.expectsMediaDataInRealTime = _encodingLiveVideo;
    }
    else
    {
        // Remove audio track if it exists
    }
}

- (void)finishRecordingWithCompletionHandler:(void (^)(void))handler;
{
    runSynchronouslyOnContextQueue(_movieWriterContext, ^{
        isRecording = NO;
        
        if (assetWriter.status == AVAssetWriterStatusCompleted || assetWriter.status == AVAssetWriterStatusCancelled || assetWriter.status == AVAssetWriterStatusUnknown)
        {
            if (handler)
                runAsynchronouslyOnContextQueue(_movieWriterContext, handler);
            return;
        }
        if( assetWriter.status == AVAssetWriterStatusWriting && ! videoEncodingIsFinished )
        {
            videoEncodingIsFinished = YES;
            [assetWriterVideoInput markAsFinished];
        }
        if( assetWriter.status == AVAssetWriterStatusWriting && ! audioEncodingIsFinished )
        {
            audioEncodingIsFinished = YES;
            [assetWriterAudioInput markAsFinished];
        }
#if (!defined(__IPHONE_6_0) || (__IPHONE_OS_VERSION_MAX_ALLOWED < __IPHONE_6_0))
        // Not iOS 6 SDK
        [assetWriter finishWriting];
        if (handler)
            runAsynchronouslyOnContextQueue(_movieWriterContext,handler);
#else
        // iOS 6 SDK
        if ([assetWriter respondsToSelector:@selector(finishWritingWithCompletionHandler:)]) {
            // Running iOS 6
            [assetWriter finishWritingWithCompletionHandler:(handler ?: ^{ })];
        }
        else {
            // Not running iOS 6
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wdeprecated-declarations"
            [assetWriter finishWriting];
#pragma clang diagnostic pop
            if (handler)
                runAsynchronouslyOnContextQueue(_movieWriterContext, handler);
        }
#endif
    });
}

// Process an audio sample buffer
- (void)processAudioBuffer:(CMSampleBufferRef)audioBuffer;
{
    if (!isRecording || _paused)
    {
        return;
    }
    
//    if (_hasAudioTrack && CMTIME_IS_VALID(startTime))
    // Only proceed if the writer has an audio track
    if (_hasAudioTrack)
    {
        CFRetain(audioBuffer);
        // Get the audio presentation timestamp
        CMTime currentSampleTime = CMSampleBufferGetOutputPresentationTimeStamp(audioBuffer);
        
        if (CMTIME_IS_INVALID(startTime))
        {
            runSynchronouslyOnContextQueue(_movieWriterContext, ^{
                // If the asset writer is not writing yet, start it
                if ((audioInputReadyCallback == NULL) && (assetWriter.status != AVAssetWriterStatusWriting))
                {
                    [assetWriter startWriting];
                }
                // 設(shè)置pts
                [assetWriter startSessionAtSourceTime:currentSampleTime];
                startTime = currentSampleTime;
            });
        }
        
        // If the audio input cannot accept data right now, drop (and optionally invalidate) this buffer
        if (!assetWriterAudioInput.readyForMoreMediaData && _encodingLiveVideo)
        {
            NSLog(@"1: Had to drop an audio frame: %@", CFBridgingRelease(CMTimeCopyDescription(kCFAllocatorDefault, currentSampleTime)));
            if (_shouldInvalidateAudioSampleWhenDone)
            {
                CMSampleBufferInvalidate(audioBuffer);
            }
            CFRelease(audioBuffer);
            return;
        }
        
        if (discont) {
            discont = NO;
            
            CMTime current;
            if (offsetTime.value > 0) {
                current = CMTimeSubtract(currentSampleTime, offsetTime);
            } else {
                current = currentSampleTime;
            }
            
            CMTime offset = CMTimeSubtract(current, previousAudioTime);
            
            if (offsetTime.value == 0) {
                offsetTime = offset;
            } else {
                offsetTime = CMTimeAdd(offsetTime, offset);
            }
        }
        
        if (offsetTime.value > 0) {
            CFRelease(audioBuffer);
            audioBuffer = [self adjustTime:audioBuffer by:offsetTime];
            CFRetain(audioBuffer);
        }
        
        // record most recent time so we know the length of the pause
        currentSampleTime = CMSampleBufferGetPresentationTimeStamp(audioBuffer);

        previousAudioTime = currentSampleTime;
        
        //if the consumer wants to do something with the audio samples before writing, let him.
        // If the consumer registered an audio-processing callback, invoke it
        if (self.audioProcessingCallback) {
            //need to introspect into the opaque CMBlockBuffer structure to find its raw sample buffers.
            CMBlockBufferRef buffer = CMSampleBufferGetDataBuffer(audioBuffer);
            CMItemCount numSamplesInBuffer = CMSampleBufferGetNumSamples(audioBuffer);
            AudioBufferList audioBufferList;
            
            CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(audioBuffer,
                                                                    NULL,
                                                                    &audioBufferList,
                                                                    sizeof(audioBufferList),
                                                                    NULL,
                                                                    NULL,
                                                                    kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,
                                                                    &buffer
                                                                    );
            //passing a live pointer to the audio buffers, try to process them in-place or we might have syncing issues.
            for (int bufferCount=0; bufferCount < audioBufferList.mNumberBuffers; bufferCount++) {
                SInt16 *samples = (SInt16 *)audioBufferList.mBuffers[bufferCount].mData;
                self.audioProcessingCallback(&samples, numSamplesInBuffer);
            }
        }
        
//        NSLog(@"Recorded audio sample time: %lld, %d, %lld", currentSampleTime.value, currentSampleTime.timescale, currentSampleTime.epoch);
        // Block that actually appends the audio buffer
        void(^write)() = ^() {
            // If the input is not ready, wait (non-live encoding only)
            while( ! assetWriterAudioInput.readyForMoreMediaData && ! _encodingLiveVideo && ! audioEncodingIsFinished ) {
                NSDate *maxDate = [NSDate dateWithTimeIntervalSinceNow:0.5];
                //NSLog(@"audio waiting...");
                [[NSRunLoop currentRunLoop] runUntilDate:maxDate];
            }
            if (!assetWriterAudioInput.readyForMoreMediaData)
            {
                NSLog(@"2: Had to drop an audio frame %@", CFBridgingRelease(CMTimeCopyDescription(kCFAllocatorDefault, currentSampleTime)));
            }
            // Only append when readyForMoreMediaData is YES and the writer is writing
            else if(assetWriter.status == AVAssetWriterStatusWriting)
            {
                if (![assetWriterAudioInput appendSampleBuffer:audioBuffer])
                    NSLog(@"Problem appending audio buffer at time: %@", CFBridgingRelease(CMTimeCopyDescription(kCFAllocatorDefault, currentSampleTime)));
            }
            else
            {
                //NSLog(@"Wrote an audio frame %@", CFBridgingRelease(CMTimeCopyDescription(kCFAllocatorDefault, currentSampleTime)));
            }
            // 標(biāo)記不被使用
            if (_shouldInvalidateAudioSampleWhenDone)
            {
                CMSampleBufferInvalidate(audioBuffer);
            }
            CFRelease(audioBuffer);
        };
//        runAsynchronouslyOnContextQueue(_movieWriterContext, write);
        // For live encoding, dispatch the write onto the shared context queue
        if( _encodingLiveVideo )
        {
            runAsynchronouslyOnContextQueue(_movieWriterContext, write);
        }
        else
        {
            // Otherwise, write synchronously
            write();
        }
    }
}

GPUImageMovie

GPUImageMovie's main job is to read and decode audio/video files. It inherits from GPUImageOutput, so it can output framebuffer objects; since it does not implement the GPUImageInput protocol, it can only act as a source at the head of a filter chain.

  • Initializers. A GPUImageMovie can be created from an NSURL, an AVPlayerItem, or an AVAsset.
- (id)initWithAsset:(AVAsset *)asset;
- (id)initWithPlayerItem:(AVPlayerItem *)playerItem;
- (id)initWithURL:(NSURL *)url;

Initialization is straightforward: the arguments are simply stored.

- (id)initWithURL:(NSURL *)url;
{
    if (!(self = [super init])) 
    {
        return nil;
    }

    [self yuvConversionSetup];

    self.url = url;
    self.asset = nil;

    return self;
}

- (id)initWithAsset:(AVAsset *)asset;
{
    if (!(self = [super init])) 
    {
      return nil;
    }
    
    [self yuvConversionSetup];

    self.url = nil;
    self.asset = asset;

    return self;
}

- (id)initWithPlayerItem:(AVPlayerItem *)playerItem;
{
    if (!(self = [super init]))
    {
        return nil;
    }

    [self yuvConversionSetup];

    self.url = nil;
    self.asset = nil;
    self.playerItem = playerItem;

    return self;
}
  • Other methods. GPUImageMovie has few methods, but they contain a lot of code and are relatively complex, so they are not examined line by line here. They fall into these groups: 1. reading audio/video data; 2. controlling the read (start, end, cancel); 3. processing decoded audio/video frames.
// Allow synchronized audio/video encoding with a GPUImageMovieWriter
- (void)enableSynchronizedEncodingUsingMovieWriter:(GPUImageMovieWriter *)movieWriter;
// Read the next video frame / audio sample
- (BOOL)readNextVideoFrameFromOutput:(AVAssetReaderOutput *)readerVideoTrackOutput;
- (BOOL)readNextAudioSampleFromOutput:(AVAssetReaderOutput *)readerAudioTrackOutput;
// Start, end, and cancel processing
- (void)startProcessing;
- (void)endProcessing;
- (void)cancelProcessing;
// Process a decoded video frame
- (void)processMovieFrame:(CMSampleBufferRef)movieSampleBuffer; 

Implementation

  • Recording video
#import "ViewController.h"
#import <GPUImage.h>

#define DOCUMENT(path) [[NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject] stringByAppendingPathComponent:path]

@interface ViewController ()
@property (weak, nonatomic) IBOutlet GPUImageView *imageView;
@property (strong, nonatomic) GPUImageVideoCamera *video;
@property (strong, nonatomic) GPUImageMovieWriter *writer;
@property (nonatomic, strong) NSURL *videoFile;
@property (nonatomic, readonly, getter=isRecording) BOOL recording;
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    
    _recording = NO;
    
    // 設(shè)置背景色
    [_imageView setBackgroundColorRed:1.0 green:1.0 blue:1.0 alpha:1.0];

    // 設(shè)置保存文件路徑
    _videoFile = [NSURL fileURLWithPath:DOCUMENT(@"/1.mov")];
    
    // Remove any existing file at that path
    [[NSFileManager defaultManager] removeItemAtURL:_videoFile error:nil];
    
    // 設(shè)置GPUImageMovieWriter
    _writer = [[GPUImageMovieWriter alloc] initWithMovieURL:_videoFile size:CGSizeMake(480, 640)];
    [_writer setHasAudioTrack:YES audioSettings:nil];
    
    // 設(shè)置GPUImageVideoCamera
    _video = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
    _video.outputImageOrientation = UIInterfaceOrientationPortrait;
    [_video addAudioInputsAndOutputs];
    
    // 設(shè)置音頻處理Target
    _video.audioEncodingTarget = _writer;

    // 設(shè)置Target
    [_video addTarget:_imageView];
    [_video addTarget:_writer];
    
    // Start capturing
    [_video startCameraCapture];
}

- (IBAction)startButtonTapped:(UIButton *)sender
{
    if (!_recording) {
        // Start recording
        [_writer startRecording];
        _recording = YES;
    }
}

- (IBAction)finishButtonTapped:(UIButton *)sender
{
    // 結(jié)束錄制
    [_writer finishRecording];
}
@end
  • Taking a photo
#import "SecondViewController.h"
#import "ImageShowViewController.h"
#import <GPUImage.h>

@interface SecondViewController ()
@property (weak, nonatomic) IBOutlet GPUImageView *imageView;
@property (nonatomic, strong) GPUImageStillCamera *camera;
@property (nonatomic, strong)  GPUImageFilter *filter;
@end

@implementation SecondViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    
    // 設(shè)置背景色
    [_imageView setBackgroundColorRed:1.0 green:1.0 blue:1.0 alpha:1.0];

    // Filter
    _filter = [[GPUImageGrayscaleFilter alloc] init];
    
    // Initialize the still camera
    _camera = [[GPUImageStillCamera alloc] initWithSessionPreset:AVCaptureSessionPresetPhoto cameraPosition:AVCaptureDevicePositionBack];
    _camera.outputImageOrientation = UIInterfaceOrientationPortrait;
    
    [_camera addTarget:_filter];
    [_filter addTarget:_imageView];
    
    // Start the camera
    [_camera startCameraCapture];
}

- (IBAction)pictureButtonTapped:(UIButton *)sender
{
    if ([_camera isRunning]) {
        [_camera capturePhotoAsImageProcessedUpToFilter:_filter withCompletionHandler:^(UIImage *processedImage, NSError *error) {
            [_camera stopCameraCapture];
            
            ImageShowViewController *imageShowVC = [[UIStoryboard storyboardWithName:@"Main" bundle:nil] instantiateViewControllerWithIdentifier:@"ImageShowViewController"];
            imageShowVC.image = processedImage;
            [self presentViewController:imageShowVC animated:YES completion:NULL];
        }];
    }else {
        [_camera startCameraCapture];
    }
}
  • Video transcoding with a filter. Because the AAC audio has to be decoded to PCM, I modified line 230 of GPUImageMovie.m (to fix the error: [AVAssetWriterInput appendSampleBuffer:] Cannot append sample buffer: Input buffer must be in an uncompressed format when outputSettings is not nil). The change is shown below, followed by the demo view controller:
NSDictionary *audioOutputSetting = @{
                                     AVFormatIDKey : @(kAudioFormatLinearPCM)
                                     };

// This might need to be extended to handle movies with more than one audio track
AVAssetTrack* audioTrack = [audioTracks objectAtIndex:0];
readerAudioTrackOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack outputSettings:audioOutputSetting];
#import "ThirdViewController.h"
#import <GPUImage.h>

#define DOCUMENT(path) [[NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject] stringByAppendingPathComponent:path]

@interface ThirdViewController ()
@property (weak, nonatomic) IBOutlet GPUImageView *imageView;
@property (nonatomic, strong) GPUImageMovie *movie;
@property (nonatomic, strong) GPUImageMovieWriter *movieWriter;
@property (nonatomic, strong) GPUImageFilter *filter;
@property (nonatomic, assign) CGSize size;
@end

@implementation ThirdViewController

- (void)viewDidLoad
{
    [super viewDidLoad];
    
    // Locate the bundled video file
    NSURL *fileURL = [[NSBundle mainBundle] URLForResource:@"1.mp4" withExtension:nil];
    AVAsset *asset = [AVAsset assetWithURL:fileURL];
    
    // Get the video dimensions
    NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
    AVAssetTrack *videoTrack = [tracks firstObject];
    _size = videoTrack.naturalSize;

    // Create the GPUImageMovie
    _movie = [[GPUImageMovie alloc] initWithAsset:asset];
    
    // Filter
    _filter = [[GPUImageGrayscaleFilter alloc] init];
    
    [_movie addTarget:_filter];
    [_filter addTarget:_imageView];
}

- (IBAction)playButtonTapped:(UIButton *)sender
{
    [_movie startProcessing];
}

- (IBAction)transcodeButtonTapped:(id)sender
{
    // Output file path
    NSURL *videoFile = [NSURL fileURLWithPath:DOCUMENT(@"/2.mov")];
    [[NSFileManager defaultManager] removeItemAtURL:videoFile error:nil];
    
    // GPUImageMovieWriter
    _movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:videoFile size:_size];
    [_movieWriter setHasAudioTrack:YES audioSettings:nil];
    
    // Wire GPUImageMovie up to the writer
    _movie.audioEncodingTarget = _movieWriter;
    [_filter addTarget:_movieWriter];
    [_movie enableSynchronizedEncodingUsingMovieWriter:_movieWriter];
    
    // Start transcoding
    [_movieWriter startRecording];
    [_movie startProcessing];
    
    // 結(jié)束
    __weak typeof(_movieWriter) wMovieWriter = _movieWriter;
    __weak typeof(self) wSelf = self;
    [_movieWriter setCompletionBlock:^{
        [wMovieWriter finishRecording];
        [wSelf.movie removeTarget:wMovieWriter];
        wSelf.movie.audioEncodingTarget = nil;
    }];
}

Summary

GPUImageVideoCamera, GPUImageStillCamera, GPUImageMovieWriter, and GPUImageMovie are extremely useful when working with the camera and with audio/video. Space does not allow walking through all of their source here; read it yourself if you need the details.

Source code: GPUImage source reading series: https://github.com/QinminiOS/GPUImage
Series of articles: GPUImage Source Code Reading: http://www.reibang.com/nb/11749791

最后編輯于
?著作權(quán)歸作者所有,轉(zhuǎn)載或內(nèi)容合作請聯(lián)系作者
  • 序言:七十年代末,一起剝皮案震驚了整個濱河市跪削,隨后出現(xiàn)的幾起案子谴仙,更是在濱河造成了極大的恐慌,老刑警劉巖碾盐,帶你破解...
    沈念sama閱讀 217,277評論 6 503
  • 序言:濱河連續(xù)發(fā)生了三起死亡事件晃跺,死亡現(xiàn)場離奇詭異,居然都是意外死亡毫玖,警方通過查閱死者的電腦和手機(jī)掀虎,發(fā)現(xiàn)死者居然都...
    沈念sama閱讀 92,689評論 3 393
  • 文/潘曉璐 我一進(jìn)店門,熙熙樓的掌柜王于貴愁眉苦臉地迎上來付枫,“玉大人烹玉,你說我怎么就攤上這事〔玻” “怎么了二打?”我有些...
    開封第一講書人閱讀 163,624評論 0 353
  • 文/不壞的土叔 我叫張陵,是天一觀的道長掂榔。 經(jīng)常有香客問我继效,道長,這世上最難降的妖魔是什么装获? 我笑而不...
    開封第一講書人閱讀 58,356評論 1 293
  • 正文 為了忘掉前任瑞信,我火速辦了婚禮,結(jié)果婚禮上穴豫,老公的妹妹穿的比我還像新娘凡简。我一直安慰自己,他們只是感情好精肃,可當(dāng)我...
    茶點故事閱讀 67,402評論 6 392
  • 文/花漫 我一把揭開白布潘鲫。 她就那樣靜靜地躺著,像睡著了一般肋杖。 火紅的嫁衣襯著肌膚如雪。 梳的紋絲不亂的頭發(fā)上挖函,一...
    開封第一講書人閱讀 51,292評論 1 301
  • 那天状植,我揣著相機(jī)與錄音浊竟,去河邊找鬼。 笑死津畸,一個胖子當(dāng)著我的面吹牛振定,可吹牛的內(nèi)容都是我干的。 我是一名探鬼主播肉拓,決...
    沈念sama閱讀 40,135評論 3 418
  • 文/蒼蘭香墨 我猛地睜開眼后频,長吁一口氣:“原來是場噩夢啊……” “哼!你這毒婦竟也來了暖途?” 一聲冷哼從身側(cè)響起卑惜,我...
    開封第一講書人閱讀 38,992評論 0 275
  • 序言:老撾萬榮一對情侶失蹤,失蹤者是張志新(化名)和其女友劉穎驻售,沒想到半個月后露久,有當(dāng)?shù)厝嗽跇淞掷锇l(fā)現(xiàn)了一具尸體,經(jīng)...
    沈念sama閱讀 45,429評論 1 314
  • 正文 獨居荒郊野嶺守林人離奇死亡欺栗,尸身上長有42處帶血的膿包…… 初始之章·張勛 以下內(nèi)容為張勛視角 年9月15日...
    茶點故事閱讀 37,636評論 3 334
  • 正文 我和宋清朗相戀三年毫痕,在試婚紗的時候發(fā)現(xiàn)自己被綠了。 大學(xué)時的朋友給我發(fā)了我未婚夫和他白月光在一起吃飯的照片迟几。...
    茶點故事閱讀 39,785評論 1 348
  • 序言:一個原本活蹦亂跳的男人離奇死亡消请,死狀恐怖,靈堂內(nèi)的尸體忽然破棺而出类腮,到底是詐尸還是另有隱情臊泰,我是刑警寧澤,帶...
    沈念sama閱讀 35,492評論 5 345
  • 正文 年R本政府宣布存哲,位于F島的核電站因宇,受9級特大地震影響,放射性物質(zhì)發(fā)生泄漏祟偷。R本人自食惡果不足惜察滑,卻給世界環(huán)境...
    茶點故事閱讀 41,092評論 3 328
  • 文/蒙蒙 一、第九天 我趴在偏房一處隱蔽的房頂上張望修肠。 院中可真熱鬧贺辰,春花似錦、人聲如沸嵌施。這莊子的主人今日做“春日...
    開封第一講書人閱讀 31,723評論 0 22
  • 文/蒼蘭香墨 我抬頭看了看天上的太陽吗伤。三九已至吃靠,卻和暖如春,著一層夾襖步出監(jiān)牢的瞬間足淆,已是汗流浹背巢块。 一陣腳步聲響...
    開封第一講書人閱讀 32,858評論 1 269
  • 我被黑心中介騙來泰國打工礁阁, 沒想到剛下飛機(jī)就差點兒被人妖公主榨干…… 1. 我叫王不留,地道東北人族奢。 一個月前我還...
    沈念sama閱讀 47,891評論 2 370
  • 正文 我出身青樓姥闭,卻偏偏與公主長得像,于是被迫代替她去往敵國和親越走。 傳聞我的和親對象是個殘疾皇子棚品,可洞房花燭夜當(dāng)晚...
    茶點故事閱讀 44,713評論 2 354

推薦閱讀更多精彩內(nèi)容