GPUImage in Detail

Overview

GPUImage is a well-known open-source image processing library that lets you apply GPU-accelerated filters and other effects to still images, live video, and the camera. Compared with the Core Image framework, GPUImage makes it easy to write your own custom filters through the interfaces it exposes. Project address: https://github.com/BradLarson/GPUImage

This article reads through the source code of the GPUImageVideoCamera, GPUImageStillCamera, GPUImageMovieWriter, and GPUImageMovie classes in the GPUImage framework. In the previous article, GPUImage源碼閱讀(四) (part 4 of this series), the data sources were images and UI rendering; this article covers data sources that come from the camera and from audio/video files. As before, GPUImageView (introduced last time) is used to display the picture, and GPUImageMovieWriter is used here to save the recorded audio/video to a file. The classes covered are:

GPUImageVideoCamera

GPUImageStillCamera

GPUImageMovieWriter

GPUImageMovie

Results

Recording video

Recording video.gif

Taking a photo

Taking a photo.png

Video transcoding with a filter

Original video.gif

Video after the filter.gif

GPUImageVideoCamera

GPUImageVideoCamera inherits from GPUImageOutput and implements the AVCaptureVideoDataOutputSampleBufferDelegate and AVCaptureAudioDataOutputSampleBufferDelegate protocols. It drives the camera for video capture; each captured frame is turned into a framebuffer object, which can then be displayed with GPUImageView or written to a movie file with GPUImageMovieWriter. It also exposes a GPUImageVideoCameraDelegate, which makes it convenient to process the raw CMSampleBuffer ourselves. The following pixel formats come up when processing video:

kCVPixelFormatType_32BGRA

kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange

kCVPixelFormatType_420YpCbCr8BiPlanarFullRange

These pixel formats were covered in detail in an earlier article, OpenGL ES入門11-相機(jī)視頻渲染 (OpenGL ES primer 11: rendering camera video), so they are not repeated here; refer to that article if needed.
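
As a quick refresher on why the full-range/video-range distinction matters, here is a minimal CPU-side sketch (not part of GPUImage) that converts a single YCbCr sample to RGB with the standard BT.601 coefficients; GPUImage does the equivalent work on the GPU through its colorConversionMatrix uniform, so treat the exact constants below as illustrative.

```objc
// Illustrative only: BT.601 YCbCr -> RGB for one pixel, showing how
// video range and full range use different offsets and scale factors.
static void YCbCrToRGB601(uint8_t y, uint8_t cb, uint8_t cr, BOOL fullRange,
                          float *r, float *g, float *b)
{
    float cbf = cb - 128.0f;
    float crf = cr - 128.0f;

    if (fullRange) {
        // kCVPixelFormatType_420YpCbCr8BiPlanarFullRange: Y in [0, 255]
        *r = y + 1.402f * crf;
        *g = y - 0.344f * cbf - 0.714f * crf;
        *b = y + 1.772f * cbf;
    } else {
        // kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange: Y in [16, 235], Cb/Cr in [16, 240]
        float yf = 1.164f * (y - 16);
        *r = yf + 1.596f * crf;
        *g = yf - 0.392f * cbf - 0.813f * crf;
        *b = yf + 2.017f * cbf;
    }
}
```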

Property list. Most of the properties are camera-related parameters.

```objc
// Whether the AVCaptureSession is running
@property(readonly, nonatomic) BOOL isRunning;
// The AVCaptureSession object
@property(readonly, retain, nonatomic) AVCaptureSession *captureSession;
// Controls the quality/size of the video output, e.g. AVCaptureSessionPreset640x480
@property(readwrite, nonatomic, copy) NSString *captureSessionPreset;
// Video frame rate
@property(readwrite) int32_t frameRate;
// Which camera is in use
@property(readonly, getter=isFrontFacingCameraPresent) BOOL frontFacingCameraPresent;
@property(readonly, getter=isBackFacingCameraPresent) BOOL backFacingCameraPresent;
// Real-time benchmark logging
@property(readwrite, nonatomic) BOOL runBenchmark;
// The capture device in use, handy for setting parameters
@property(readonly) AVCaptureDevice *inputCamera;
// Orientation of the output image
@property(readwrite, nonatomic) UIInterfaceOrientation outputImageOrientation;
// Horizontal mirroring of the front / rear camera
@property(readwrite, nonatomic) BOOL horizontallyMirrorFrontFacingCamera, horizontallyMirrorRearFacingCamera;
// GPUImageVideoCameraDelegate
@property(nonatomic, assign) id<GPUImageVideoCameraDelegate> delegate;
```

Initialization method.

```objc
- (id)initWithSessionPreset:(NSString *)sessionPreset cameraPosition:(AVCaptureDevicePosition)cameraPosition;
```

GPUImageVideoCamera has only this one designated initializer, which takes the session preset (video quality) and which camera to use. If you call - (instancetype)init directly, it initializes with AVCaptureSessionPreset640x480 and AVCaptureDevicePositionBack.

```objc
- (id)initWithSessionPreset:(NSString *)sessionPreset cameraPosition:(AVCaptureDevicePosition)cameraPosition;
{
    if (!(self = [super init])) {
        return nil;
    }

    // Create the video and audio processing queues
    cameraProcessingQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0);
    audioProcessingQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_LOW, 0);

    // Create the semaphore
    frameRenderingSemaphore = dispatch_semaphore_create(1);

    // Initialize instance variables
    _frameRate = 0; // This will not set frame rate unless this value gets set to 1 or above
    _runBenchmark = NO;
    capturePaused = NO;
    outputRotation = kGPUImageNoRotation;
    internalRotation = kGPUImageNoRotation;
    captureAsYUV = YES;
    _preferredConversion = kColorConversion709;

    // Grab the front or back camera according to the requested position
    _inputCamera = nil;
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if ([device position] == cameraPosition) {
            _inputCamera = device;
        }
    }

    // Bail out immediately if no camera was found
    if (!_inputCamera) {
        return nil;
    }

    // Create the capture session
    _captureSession = [[AVCaptureSession alloc] init];

    // Begin configuration
    [_captureSession beginConfiguration];

    // Create the video input
    NSError *error = nil;
    videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:_inputCamera error:&error];
    if ([_captureSession canAddInput:videoInput]) {
        [_captureSession addInput:videoInput];
    }

    // Create the video output
    videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    [videoOutput setAlwaysDiscardsLateVideoFrames:NO];

//    if (captureAsYUV && [GPUImageContext deviceSupportsRedTextures])
    // Decide how YUV will be handled
    if (captureAsYUV && [GPUImageContext supportsFastTextureUpload]) {
        BOOL supportsFullYUVRange = NO;
        NSArray *supportedPixelFormats = videoOutput.availableVideoCVPixelFormatTypes;
        for (NSNumber *currentPixelFormat in supportedPixelFormats) {
            if ([currentPixelFormat intValue] == kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) {
                supportsFullYUVRange = YES;
            }
        }

        if (supportsFullYUVRange) {
            // Use kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
            [videoOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
            isFullYUVRange = YES;
        } else {
            // Use kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
            [videoOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
            isFullYUVRange = NO;
        }
    } else {
        // Use kCVPixelFormatType_32BGRA
        [videoOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
    }

    // Create the GL program and look up attribute/uniform locations
    runSynchronouslyOnVideoProcessingQueue(^{
        if (captureAsYUV) {
            [GPUImageContext useImageProcessingContext];
            //            if ([GPUImageContext deviceSupportsRedTextures])
            //            {
            //                yuvConversionProgram = [[GPUImageContext sharedImageProcessingContext] programForVertexShaderString:kGPUImageVertexShaderString fragmentShaderString:kGPUImageYUVVideoRangeConversionForRGFragmentShaderString];
            //            }
            //            else
            //            {
            if (isFullYUVRange) {
                yuvConversionProgram = [[GPUImageContext sharedImageProcessingContext] programForVertexShaderString:kGPUImageVertexShaderString fragmentShaderString:kGPUImageYUVFullRangeConversionForLAFragmentShaderString];
            } else {
                yuvConversionProgram = [[GPUImageContext sharedImageProcessingContext] programForVertexShaderString:kGPUImageVertexShaderString fragmentShaderString:kGPUImageYUVVideoRangeConversionForLAFragmentShaderString];
            }
            //            }

            if (!yuvConversionProgram.initialized) {
                [yuvConversionProgram addAttribute:@"position"];
                [yuvConversionProgram addAttribute:@"inputTextureCoordinate"];

                if (![yuvConversionProgram link]) {
                    NSString *progLog = [yuvConversionProgram programLog];
                    NSLog(@"Program link log: %@", progLog);
                    NSString *fragLog = [yuvConversionProgram fragmentShaderLog];
                    NSLog(@"Fragment shader compile log: %@", fragLog);
                    NSString *vertLog = [yuvConversionProgram vertexShaderLog];
                    NSLog(@"Vertex shader compile log: %@", vertLog);
                    yuvConversionProgram = nil;
                    NSAssert(NO, @"Filter shader link failed");
                }
            }

            yuvConversionPositionAttribute = [yuvConversionProgram attributeIndex:@"position"];
            yuvConversionTextureCoordinateAttribute = [yuvConversionProgram attributeIndex:@"inputTextureCoordinate"];
            yuvConversionLuminanceTextureUniform = [yuvConversionProgram uniformIndex:@"luminanceTexture"];
            yuvConversionChrominanceTextureUniform = [yuvConversionProgram uniformIndex:@"chrominanceTexture"];
            yuvConversionMatrixUniform = [yuvConversionProgram uniformIndex:@"colorConversionMatrix"];

            [GPUImageContext setActiveShaderProgram:yuvConversionProgram];

            glEnableVertexAttribArray(yuvConversionPositionAttribute);
            glEnableVertexAttribArray(yuvConversionTextureCoordinateAttribute);
        }
    });

    // Set the AVCaptureVideoDataOutputSampleBufferDelegate
    [videoOutput setSampleBufferDelegate:self queue:cameraProcessingQueue];

    // Add the output
    if ([_captureSession canAddOutput:videoOutput]) {
        [_captureSession addOutput:videoOutput];
    } else {
        NSLog(@"Couldn't add video output");
        return nil;
    }

    // Set the video quality
    _captureSessionPreset = sessionPreset;
    [_captureSession setSessionPreset:_captureSessionPreset];

    // This will let you get 60 FPS video from the 720p preset on an iPhone 4S, but only that device and that preset
    //    AVCaptureConnection *conn = [videoOutput connectionWithMediaType:AVMediaTypeVideo];
    //
    //    if (conn.supportsVideoMinFrameDuration)
    //        conn.videoMinFrameDuration = CMTimeMake(1,60);
    //    if (conn.supportsVideoMaxFrameDuration)
    //        conn.videoMaxFrameDuration = CMTimeMake(1,60);

    // Commit the configuration
    [_captureSession commitConfiguration];

    return self;
}
```

Other methods. The GPUImageVideoCamera methods fall roughly into these groups: 1. adding/removing input and output devices; 2. capture control; 3. processing audio and video; 4. camera parameters.

```objc
// Add / remove the audio input and output
- (BOOL)addAudioInputsAndOutputs;
- (BOOL)removeAudioInputsAndOutputs;
// Remove all inputs and outputs
- (void)removeInputsAndOutputs;
// Start, stop, pause, and resume camera capture
- (void)startCameraCapture;
- (void)stopCameraCapture;
- (void)pauseCameraCapture;
- (void)resumeCameraCapture;
// Process audio and video
- (void)processVideoSampleBuffer:(CMSampleBufferRef)sampleBuffer;
- (void)processAudioSampleBuffer:(CMSampleBufferRef)sampleBuffer;
// Query camera-related parameters
- (AVCaptureDevicePosition)cameraPosition;
- (AVCaptureConnection *)videoCaptureConnection;
+ (BOOL)isBackFacingCameraPresent;
+ (BOOL)isFrontFacingCameraPresent;
// Switch between the front and back camera
- (void)rotateCamera;
// Average frame duration during capture
- (CGFloat)averageFrameDurationDuringCapture;
// Reset the benchmark counters
- (void)resetBenchmarkAverage;
```

Although GPUImageVideoCamera has quite a few methods, their internal logic is not very complex.

// 增加音頻輸入輸出-(BOOL)addAudioInputsAndOutputs{if(audioOutput)returnNO;[_captureSession beginConfiguration];_microphone=[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];audioInput=[AVCaptureDeviceInput deviceInputWithDevice:_microphone error:nil];if([_captureSession canAddInput:audioInput]){[_captureSession addInput:audioInput];}audioOutput=[[AVCaptureAudioDataOutput alloc]init];if([_captureSession canAddOutput:audioOutput]){[_captureSession addOutput:audioOutput];}else{NSLog(@"Couldn't add audio output");}[audioOutput setSampleBufferDelegate:selfqueue:audioProcessingQueue];[_captureSession commitConfiguration];returnYES;}// 移除音頻輸入輸出-(BOOL)removeAudioInputsAndOutputs{if(!audioOutput)returnNO;[_captureSession beginConfiguration];[_captureSession removeInput:audioInput];[_captureSession removeOutput:audioOutput];audioInput=nil;audioOutput=nil;_microphone=nil;[_captureSession commitConfiguration];returnYES;}// 移除所有輸入輸出-(void)removeInputsAndOutputs;{[_captureSession beginConfiguration];if(videoInput){[_captureSession removeInput:videoInput];[_captureSession removeOutput:videoOutput];videoInput=nil;videoOutput=nil;}if(_microphone!=nil){[_captureSession removeInput:audioInput];[_captureSession removeOutput:audioOutput];audioInput=nil;audioOutput=nil;_microphone=nil;}[_captureSession commitConfiguration];}// 開始捕獲-(void)startCameraCapture;{if(![_captureSession isRunning]){startingCaptureTime=[NSDate date];[_captureSession startRunning];};}// 停止捕獲-(void)stopCameraCapture;{if([_captureSession isRunning]){[_captureSession stopRunning];}}// 暫停捕獲-(void)pauseCameraCapture;{capturePaused=YES;}// 恢復(fù)捕獲-(void)resumeCameraCapture;{capturePaused=NO;}// 處理視頻-(void)processVideoSampleBuffer:(CMSampleBufferRef)sampleBuffer;{if(capturePaused){return;}CFAbsoluteTime startTime=CFAbsoluteTimeGetCurrent();CVImageBufferRef cameraFrame=CMSampleBufferGetImageBuffer(sampleBuffer);// 獲取視頻寬高intbufferWidth=(int)CVPixelBufferGetWidth(cameraFrame);intbufferHeight=(int)CVPixelBufferGetHeight(cameraFrame);CFTypeRef colorAttachments=CVBufferGetAttachment(cameraFrame,kCVImageBufferYCbCrMatrixKey,NULL);if(colorAttachments!=NULL){if(CFStringCompare(colorAttachments,kCVImageBufferYCbCrMatrix_ITU_R_601_4,0)==kCFCompareEqualTo){if(isFullYUVRange){_preferredConversion=kColorConversion601FullRange;}else{_preferredConversion=kColorConversion601;}}else{_preferredConversion=kColorConversion709;}}else{if(isFullYUVRange){_preferredConversion=kColorConversion601FullRange;}else{_preferredConversion=kColorConversion601;}}CMTime currentTime=CMSampleBufferGetPresentationTimeStamp(sampleBuffer);[GPUImageContext useImageProcessingContext];// 快速YUV紋理生成if([GPUImageContext supportsFastTextureUpload]&&captureAsYUV){CVOpenGLESTextureRef luminanceTextureRef=NULL;CVOpenGLESTextureRef chrominanceTextureRef=NULL;//? ? ? ? if (captureAsYUV && [GPUImageContext deviceSupportsRedTextures])if(CVPixelBufferGetPlaneCount(cameraFrame)>0)// Check for YUV planar inputs to do RGB conversion{CVPixelBufferLockBaseAddress(cameraFrame,0);if((imageBufferWidth!=bufferWidth)&&(imageBufferHeight!=bufferHeight)){imageBufferWidth=bufferWidth;imageBufferHeight=bufferHeight;}CVReturn err;// Y分量glActiveTexture(GL_TEXTURE4);if([GPUImageContext deviceSupportsRedTextures]){//? ? ? ? ? ? ? ? 
err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, coreVideoTextureCache, cameraFrame, NULL, GL_TEXTURE_2D, GL_RED_EXT, bufferWidth, bufferHeight, GL_RED_EXT, GL_UNSIGNED_BYTE, 0, &luminanceTextureRef);err=CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,[[GPUImageContext sharedImageProcessingContext]coreVideoTextureCache],cameraFrame,NULL,GL_TEXTURE_2D,GL_LUMINANCE,bufferWidth,bufferHeight,GL_LUMINANCE,GL_UNSIGNED_BYTE,0,&luminanceTextureRef);}else{err=CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,[[GPUImageContext sharedImageProcessingContext]coreVideoTextureCache],cameraFrame,NULL,GL_TEXTURE_2D,GL_LUMINANCE,bufferWidth,bufferHeight,GL_LUMINANCE,GL_UNSIGNED_BYTE,0,&luminanceTextureRef);}if(err){NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d",err);}luminanceTexture=CVOpenGLESTextureGetName(luminanceTextureRef);glBindTexture(GL_TEXTURE_2D,luminanceTexture);glTexParameterf(GL_TEXTURE_2D,GL_TEXTURE_WRAP_S,GL_CLAMP_TO_EDGE);glTexParameterf(GL_TEXTURE_2D,GL_TEXTURE_WRAP_T,GL_CLAMP_TO_EDGE);// UV分量(Width/2 = Width/4 + Width/4)glActiveTexture(GL_TEXTURE5);if([GPUImageContext deviceSupportsRedTextures]){//? ? ? ? ? ? ? ? err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, coreVideoTextureCache, cameraFrame, NULL, GL_TEXTURE_2D, GL_RG_EXT, bufferWidth/2, bufferHeight/2, GL_RG_EXT, GL_UNSIGNED_BYTE, 1, &chrominanceTextureRef);err=CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,[[GPUImageContext sharedImageProcessingContext]coreVideoTextureCache],cameraFrame,NULL,GL_TEXTURE_2D,GL_LUMINANCE_ALPHA,bufferWidth/2,bufferHeight/2,GL_LUMINANCE_ALPHA,GL_UNSIGNED_BYTE,1,&chrominanceTextureRef);}else{err=CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,[[GPUImageContext sharedImageProcessingContext]coreVideoTextureCache],cameraFrame,NULL,GL_TEXTURE_2D,GL_LUMINANCE_ALPHA,bufferWidth/2,bufferHeight/2,GL_LUMINANCE_ALPHA,GL_UNSIGNED_BYTE,1,&chrominanceTextureRef);}if(err){NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d",err);}chrominanceTexture=CVOpenGLESTextureGetName(chrominanceTextureRef);glBindTexture(GL_TEXTURE_2D,chrominanceTexture);glTexParameterf(GL_TEXTURE_2D,GL_TEXTURE_WRAP_S,GL_CLAMP_TO_EDGE);glTexParameterf(GL_TEXTURE_2D,GL_TEXTURE_WRAP_T,GL_CLAMP_TO_EDGE);//? ? ? ? ? ? if (!allTargetsWantMonochromeData)//? ? ? ? ? ? {[selfconvertYUVToRGBOutput];//? ? ? ? ? ? }introtatedImageBufferWidth=bufferWidth,rotatedImageBufferHeight=bufferHeight;if(GPUImageRotationSwapsWidthAndHeight(internalRotation)){rotatedImageBufferWidth=bufferHeight;rotatedImageBufferHeight=bufferWidth;}[selfupdateTargetsForVideoCameraUsingCacheTextureAtWidth:rotatedImageBufferWidth height:rotatedImageBufferHeight time:currentTime];CVPixelBufferUnlockBaseAddress(cameraFrame,0);CFRelease(luminanceTextureRef);CFRelease(chrominanceTextureRef);}else{// TODO: Mesh this with the output framebuffer structure//? ? ? ? ? ? CVPixelBufferLockBaseAddress(cameraFrame, 0);//? ? ? ? ? ? //? ? ? ? ? ? CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, [[GPUImageContext sharedImageProcessingContext] coreVideoTextureCache], cameraFrame, NULL, GL_TEXTURE_2D, GL_RGBA, bufferWidth, bufferHeight, GL_BGRA, GL_UNSIGNED_BYTE, 0, &texture);//? ? ? ? ? ? //? ? ? ? ? ? if (!texture || err) {//? ? ? ? ? ? ? ? NSLog(@"Camera CVOpenGLESTextureCacheCreateTextureFromImage failed (error: %d)", err);//? ? ? ? ? ? ? ? NSAssert(NO, @"Camera failure");//? ? ? ? ? ? ? ? return;//? ? ? ? ? ? }//? ? ? ? ? ? 
//? ? ? ? ? ? outputTexture = CVOpenGLESTextureGetName(texture);//? ? ? ? ? ? //? ? ? ? glBindTexture(CVOpenGLESTextureGetTarget(texture), outputTexture);//? ? ? ? ? ? glBindTexture(GL_TEXTURE_2D, outputTexture);//? ? ? ? ? ? glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);//? ? ? ? ? ? glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);//? ? ? ? ? ? glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);//? ? ? ? ? ? glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);//? ? ? ? ? ? //? ? ? ? ? ? [self updateTargetsForVideoCameraUsingCacheTextureAtWidth:bufferWidth height:bufferHeight time:currentTime];////? ? ? ? ? ? CVPixelBufferUnlockBaseAddress(cameraFrame, 0);//? ? ? ? ? ? CFRelease(texture);////? ? ? ? ? ? outputTexture = 0;}// 幀率if(_runBenchmark){numberOfFramesCaptured++;if(numberOfFramesCaptured>INITIALFRAMESTOIGNOREFORBENCHMARK){CFAbsoluteTime currentFrameTime=(CFAbsoluteTimeGetCurrent()-startTime);totalFrameTimeDuringCapture+=currentFrameTime;NSLog(@"Average frame time : %f ms",[selfaverageFrameDurationDuringCapture]);NSLog(@"Current frame time : %f ms",1000.0*currentFrameTime);}}}else{// 鎖定基地址CVPixelBufferLockBaseAddress(cameraFrame,0);// 獲取每行的字節(jié)寬度(width * 4)intbytesPerRow=(int)CVPixelBufferGetBytesPerRow(cameraFrame);// 獲取幀緩存outputFramebuffer=[[GPUImageContext sharedFramebufferCache]fetchFramebufferForSize:CGSizeMake(bytesPerRow/4,bufferHeight)onlyTexture:YES];[outputFramebuffer activateFramebuffer];// 激活紋理glBindTexture(GL_TEXTURE_2D,[outputFramebuffer texture]);//? ? ? ? glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, bufferWidth, bufferHeight, 0, GL_BGRA, GL_UNSIGNED_BYTE, CVPixelBufferGetBaseAddress(cameraFrame));// Using BGRA extension to pull in video frame data directly// The use of bytesPerRow / 4 accounts for a display glitch present in preview video frames when using the photo preset on the camera// BGRA轉(zhuǎn)RGBAglTexImage2D(GL_TEXTURE_2D,0,GL_RGBA,bytesPerRow/4,bufferHeight,0,GL_BGRA,GL_UNSIGNED_BYTE,CVPixelBufferGetBaseAddress(cameraFrame));[selfupdateTargetsForVideoCameraUsingCacheTextureAtWidth:bytesPerRow/4height:bufferHeight time:currentTime];CVPixelBufferUnlockBaseAddress(cameraFrame,0);// 更新幀率if(_runBenchmark){numberOfFramesCaptured++;if(numberOfFramesCaptured>INITIALFRAMESTOIGNOREFORBENCHMARK){CFAbsoluteTime currentFrameTime=(CFAbsoluteTimeGetCurrent()-startTime);totalFrameTimeDuringCapture+=currentFrameTime;}}}}// 處理音頻-(void)processAudioSampleBuffer:(CMSampleBufferRef)sampleBuffer;{[self.audioEncodingTarget processAudioBuffer:sampleBuffer];}

Note

processAudioSampleBuffer simply hands the audio buffer to audioEncodingTarget for processing. So if you want sound in a recorded video, you must set audioEncodingTarget; otherwise the recorded movie will be silent.
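
For example, a minimal sketch (based on the recording demo later in this article; `movieURL` is assumed to be a file NSURL you have prepared) of wiring the audio path before recording:

```objc
// Sketch: make sure audio reaches the writer, otherwise the recorded movie has no sound.
GPUImageVideoCamera *camera =
    [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                         cameraPosition:AVCaptureDevicePositionBack];
GPUImageMovieWriter *writer =
    [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(480, 640)];
[writer setHasAudioTrack:YES audioSettings:nil];

[camera addAudioInputsAndOutputs];   // add the microphone input/output to the session
camera.audioEncodingTarget = writer; // route processAudioSampleBuffer: to the writer

[camera addTarget:writer];
[camera startCameraCapture];
[writer startRecording];
```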

Texture formats and color mapping

| Base format | Texel data as (R, G, B, A) |
|:----:|:---:|
| GL_RED | (R, 0.0, 0.0, 1.0) |
| GL_RG | (R, G, 0.0, 1.0) |
| GL_RGB | (R, G, B, 1.0) |
| GL_RGBA | (R, G, B, A) |
| GL_LUMINANCE | (L, L, L, 1.0) |
| GL_LUMINANCE_ALPHA | (L, L, L, A) |
| GL_ALPHA | (0.0, 0.0, 0.0, A) |

The table above explains why, in GPUImage, the Y plane is uploaded with the GL_LUMINANCE internal format while the interleaved UV plane uses GL_LUMINANCE_ALPHA: the shader can read Y from the red channel of the luminance texture and Cb/Cr from the red and alpha channels of the chrominance texture.
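
In shader terms this looks roughly like the snippet below, a simplified sketch modeled on (but not copied verbatim from) GPUImage's kGPUImageYUVFullRangeConversionForLAFragmentShaderString:

```objc
// Sketch of a luminance/alpha-based YUV -> RGB fragment shader, kept as an
// Objective-C string constant the way GPUImage stores its shaders.
static NSString *const kExampleYUVConversionFragmentShaderString =
    @"varying highp vec2 textureCoordinate;\n"
    @"uniform sampler2D luminanceTexture;\n"
    @"uniform sampler2D chrominanceTexture;\n"
    @"uniform mediump mat3 colorConversionMatrix;\n"
    @"void main()\n"
    @"{\n"
    @"    mediump vec3 yuv;\n"
         // GL_LUMINANCE replicates Y into R/G/B, so .r holds the luma sample
    @"    yuv.x  = texture2D(luminanceTexture, textureCoordinate).r;\n"
         // GL_LUMINANCE_ALPHA puts Cb into R/G/B and Cr into A, so .ra holds the chroma pair
    @"    yuv.yz = texture2D(chrominanceTexture, textureCoordinate).ra - vec2(0.5, 0.5);\n"
    @"    gl_FragColor = vec4(colorConversionMatrix * yuv, 1.0);\n"
    @"}";
```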

GPUImageStillCamera

GPUImageStillCamera is mainly used for taking photos. It inherits from GPUImageVideoCamera, so on top of everything GPUImageVideoCamera can do, it adds a rich set of photo-capture APIs that make still capture convenient.

Property list. GPUImageStillCamera has only a few properties, all of them image-related.

```objc
// JPEG compression quality, 0.8 by default
@property CGFloat jpegCompressionQuality;
// Metadata of the captured image
@property (readonly) NSDictionary *currentCaptureMetadata;
```

Method list. The methods are all about taking photos, with a rich set of output types: CMSampleBuffer, UIImage, NSData, and so on. If filtering is needed, the last filter in the chain (finalFilterInChain) can be passed in.

```objc
- (void)capturePhotoAsSampleBufferWithCompletionHandler:(void (^)(CMSampleBufferRef imageSampleBuffer, NSError *error))block;
- (void)capturePhotoAsImageProcessedUpToFilter:(GPUImageOutput<GPUImageInput> *)finalFilterInChain withCompletionHandler:(void (^)(UIImage *processedImage, NSError *error))block;
- (void)capturePhotoAsImageProcessedUpToFilter:(GPUImageOutput<GPUImageInput> *)finalFilterInChain withOrientation:(UIImageOrientation)orientation withCompletionHandler:(void (^)(UIImage *processedImage, NSError *error))block;
- (void)capturePhotoAsJPEGProcessedUpToFilter:(GPUImageOutput<GPUImageInput> *)finalFilterInChain withCompletionHandler:(void (^)(NSData *processedJPEG, NSError *error))block;
- (void)capturePhotoAsJPEGProcessedUpToFilter:(GPUImageOutput<GPUImageInput> *)finalFilterInChain withOrientation:(UIImageOrientation)orientation withCompletionHandler:(void (^)(NSData *processedJPEG, NSError *error))block;
- (void)capturePhotoAsPNGProcessedUpToFilter:(GPUImageOutput<GPUImageInput> *)finalFilterInChain withCompletionHandler:(void (^)(NSData *processedPNG, NSError *error))block;
- (void)capturePhotoAsPNGProcessedUpToFilter:(GPUImageOutput<GPUImageInput> *)finalFilterInChain withOrientation:(UIImageOrientation)orientation withCompletionHandler:(void (^)(NSData *processedPNG, NSError *error))block;
```

Although the API surface is rich, every variant ends up calling the private method - (void)capturePhotoProcessedUpToFilter:(GPUImageOutput<GPUImageInput> *)finalFilterInChain withImageOnGPUHandler:(void (^)(NSError *error))block, so that is the method to focus on.

-(void)capturePhotoAsJPEGProcessedUpToFilter:(GPUImageOutput<GPUImageInput>*)finalFilterInChain withOrientation:(UIImageOrientation)orientation withCompletionHandler:(void(^)(NSData*processedImage,NSError*error))block{// 調(diào)用私有方法生成幀緩存對象[selfcapturePhotoProcessedUpToFilter:finalFilterInChain withImageOnGPUHandler:^(NSError*error){NSData*dataForJPEGFile=nil;if(!error){@autoreleasepool{// 讀取幀緩存并生成UIImage對象UIImage*filteredPhoto=[finalFilterInChain imageFromCurrentFramebufferWithOrientation:orientation];dispatch_semaphore_signal(frameRenderingSemaphore);// 由UIImage生成NSData對象dataForJPEGFile=UIImageJPEGRepresentation(filteredPhoto,self.jpegCompressionQuality);}}else{dispatch_semaphore_signal(frameRenderingSemaphore);}block(dataForJPEGFile,error);}];}-(void)capturePhotoProcessedUpToFilter:(GPUImageOutput<GPUImageInput>*)finalFilterInChain withImageOnGPUHandler:(void(^)(NSError*error))block{// 等待計(jì)數(shù)器dispatch_semaphore_wait(frameRenderingSemaphore,DISPATCH_TIME_FOREVER);// 判斷是否捕獲圖像if(photoOutput.isCapturingStillImage){block([NSError errorWithDomain:AVFoundationErrorDomain code:AVErrorMaximumStillImageCaptureRequestsExceeded userInfo:nil]);return;}// 異步捕獲圖像[photoOutput captureStillImageAsynchronouslyFromConnection:[[photoOutput connections]objectAtIndex:0]completionHandler:^(CMSampleBufferRef imageSampleBuffer,NSError*error){if(imageSampleBuffer==NULL){block(error);return;}// For now, resize photos to fix within the max texture size of the GPUCVImageBufferRef cameraFrame=CMSampleBufferGetImageBuffer(imageSampleBuffer);// 獲取圖像大小CGSize sizeOfPhoto=CGSizeMake(CVPixelBufferGetWidth(cameraFrame),CVPixelBufferGetHeight(cameraFrame));CGSize scaledImageSizeToFitOnGPU=[GPUImageContext sizeThatFitsWithinATextureForSize:sizeOfPhoto];// 判斷時候需要調(diào)整大小if(!CGSizeEqualToSize(sizeOfPhoto,scaledImageSizeToFitOnGPU)){CMSampleBufferRef sampleBuffer=NULL;if(CVPixelBufferGetPlaneCount(cameraFrame)>0){NSAssert(NO,@"Error: no downsampling for YUV input in the framework yet");}else{// 圖像調(diào)整GPUImageCreateResizedSampleBuffer(cameraFrame,scaledImageSizeToFitOnGPU,&sampleBuffer);}dispatch_semaphore_signal(frameRenderingSemaphore);[finalFilterInChain useNextFrameForImageCapture];// 調(diào)用父類進(jìn)行圖片處理射沟,生成幀緩存對象[selfcaptureOutput:photoOutput didOutputSampleBuffer:sampleBuffer fromConnection:[[photoOutput connections]objectAtIndex:0]];dispatch_semaphore_wait(frameRenderingSemaphore,DISPATCH_TIME_FOREVER);if(sampleBuffer!=NULL)CFRelease(sampleBuffer);}else{// This is a workaround for the corrupt images that are sometimes returned when taking a photo with the front camera and using the iOS 5.0 texture cachesAVCaptureDevicePosition currentCameraPosition=[[videoInput device]position];if((currentCameraPosition!=AVCaptureDevicePositionFront)||(![GPUImageContext supportsFastTextureUpload])||!requiresFrontCameraTextureCacheCorruptionWorkaround){dispatch_semaphore_signal(frameRenderingSemaphore);[finalFilterInChain useNextFrameForImageCapture];// 調(diào)用父類進(jìn)行圖片處理殊者,生成幀緩存對象[selfcaptureOutput:photoOutput didOutputSampleBuffer:imageSampleBuffer fromConnection:[[photoOutput connections]objectAtIndex:0]];dispatch_semaphore_wait(frameRenderingSemaphore,DISPATCH_TIME_FOREVER);}}// 獲取圖像的metadata信息CFDictionaryRef metadata=CMCopyDictionaryOfAttachments(NULL,imageSampleBuffer,kCMAttachmentMode_ShouldPropagate);_currentCaptureMetadata=(__bridge_transfer NSDictionary*)metadata;block(nil);_currentCaptureMetadata=nil;}];}
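
As a usage sketch (assumptions: `stillCamera` is a running GPUImageStillCamera, `filter` is the last filter in its chain, and `jpegPath` is a writable file path), capturing straight to a JPEG file could look like this:

```objc
[stillCamera capturePhotoAsJPEGProcessedUpToFilter:filter
                             withCompletionHandler:^(NSData *processedJPEG, NSError *error) {
    if (error) {
        NSLog(@"Capture failed: %@", error);
        return;
    }
    // currentCaptureMetadata is populated while this completion handler runs (see the source above)
    NSLog(@"Capture metadata: %@", stillCamera.currentCaptureMetadata);
    [processedJPEG writeToFile:jpegPath atomically:YES];
}];
```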

GPUImageMovieWriter

GPUImageMovieWriter's main job is to encode audio/video and save it to a file. It implements the GPUImageInput protocol, so it can accept framebuffer input. When recording, GPUImageMovieWriter mainly relies on AVAssetWriter, AVAssetWriterInput, and AVAssetWriterInputPixelBufferAdaptor. AVAssetWriter supports quite a few file formats, listed in the table below:

| Constant | File extension |
|:----|:----|
| AVFileTypeQuickTimeMovie | .mov or .qt |
| AVFileTypeMPEG4 | .mp4 |
| AVFileTypeAppleM4V | .m4v |
| AVFileTypeAppleM4A | .m4a |
| AVFileType3GPP | .3gp, .3gpp, or .sdv |
| AVFileType3GPP2 | .3g2 or .3gp2 |
| AVFileTypeCoreAudioFormat | .caf |
| AVFileTypeWAVE | .wav, .wave, or .bwf |
| AVFileTypeAIFF | .aif or .aiff |
| AVFileTypeAIFC | .aifc or .cdda |
| AVFileTypeAMR | .amr |
| AVFileTypeMPEGLayer3 | .mp3 |
| AVFileTypeSunAU | .au or .snd |
| AVFileTypeAC3 | .ac3 |
| AVFileTypeEnhancedAC3 | .eac3 |

Properties. GPUImageMovieWriter has quite a few properties, but they are practical ones, mostly related to the state of audio/video processing: whether to save audio, completion callbacks, failure callbacks, and so on. Some of the more important ones are listed below.

```objc
// Whether there is an audio track
@property(readwrite, nonatomic) BOOL hasAudioTrack;
// Pass audio through without re-encoding it
@property(readwrite, nonatomic) BOOL shouldPassthroughAudio;
// Invalidate audio sample buffers once they have been written
@property(readwrite, nonatomic) BOOL shouldInvalidateAudioSampleWhenDone;
// Completion and failure callbacks
@property(nonatomic, copy) void(^completionBlock)(void);
@property(nonatomic, copy) void(^failureBlock)(NSError *);
// Whether live video is being encoded in real time
@property(readwrite, nonatomic) BOOL encodingLiveVideo;
// Video/audio input ready callbacks
@property(nonatomic, copy) BOOL(^videoInputReadyCallback)(void);
@property(nonatomic, copy) BOOL(^audioInputReadyCallback)(void);
// Audio processing callback
@property(nonatomic, copy) void(^audioProcessingCallback)(SInt16 **samplesRef, CMItemCount numSamplesInBuffer);
// The underlying AVAssetWriter
@property(nonatomic, readonly) AVAssetWriter *assetWriter;
// Duration from the start of recording to the previous frame
@property(nonatomic, readonly) CMTime duration;
```

Initialization methods.

```objc
- (id)initWithMovieURL:(NSURL *)newMovieURL size:(CGSize)newSize;
- (id)initWithMovieURL:(NSURL *)newMovieURL size:(CGSize)newSize fileType:(NSString *)newFileType outputSettings:(NSDictionary *)outputSettings;
```

Initialization involves: 1. setting up instance variables; 2. creating the OpenGL program; 3. configuring the AVAssetWriter parameters such as the video codec and size. Note that audio is not configured during initialization; if you need audio, call - (void)setHasAudioTrack:(BOOL)newValue (or the variant that also takes audio settings) afterwards.

```objc
- (id)initWithMovieURL:(NSURL *)newMovieURL size:(CGSize)newSize;
{
    // Forward to the designated initializer
    return [self initWithMovieURL:newMovieURL size:newSize fileType:AVFileTypeQuickTimeMovie outputSettings:nil];
}

- (id)initWithMovieURL:(NSURL *)newMovieURL size:(CGSize)newSize fileType:(NSString *)newFileType outputSettings:(NSMutableDictionary *)outputSettings;
{
    if (!(self = [super init])) {
        return nil;
    }

    // Initialize instance variables
    _shouldInvalidateAudioSampleWhenDone = NO;
    self.enabled = YES;
    alreadyFinishedRecording = NO;
    videoEncodingIsFinished = NO;
    audioEncodingIsFinished = NO;
    discont = NO;
    videoSize = newSize;
    movieURL = newMovieURL;
    fileType = newFileType;
    startTime = kCMTimeInvalid;
    _encodingLiveVideo = [[outputSettings objectForKey:@"EncodingLiveVideo"] isKindOfClass:[NSNumber class]] ? [[outputSettings objectForKey:@"EncodingLiveVideo"] boolValue] : YES;
    previousFrameTime = kCMTimeNegativeInfinity;
    previousAudioTime = kCMTimeNegativeInfinity;
    inputRotation = kGPUImageNoRotation;

    // Set up the context
    _movieWriterContext = [[GPUImageContext alloc] init];
    [_movieWriterContext useSharegroup:[[[GPUImageContext sharedImageProcessingContext] context] sharegroup]];

    runSynchronouslyOnContextQueue(_movieWriterContext, ^{
        [_movieWriterContext useAsCurrentContext];

        // Create the OpenGL program
        if ([GPUImageContext supportsFastTextureUpload]) {
            colorSwizzlingProgram = [_movieWriterContext programForVertexShaderString:kGPUImageVertexShaderString fragmentShaderString:kGPUImagePassthroughFragmentShaderString];
        } else {
            colorSwizzlingProgram = [_movieWriterContext programForVertexShaderString:kGPUImageVertexShaderString fragmentShaderString:kGPUImageColorSwizzlingFragmentShaderString];
        }

        // Look up the attributes and uniforms in the GLSL program
        if (!colorSwizzlingProgram.initialized) {
            [colorSwizzlingProgram addAttribute:@"position"];
            [colorSwizzlingProgram addAttribute:@"inputTextureCoordinate"];

            if (![colorSwizzlingProgram link]) {
                NSString *progLog = [colorSwizzlingProgram programLog];
                NSLog(@"Program link log: %@", progLog);
                NSString *fragLog = [colorSwizzlingProgram fragmentShaderLog];
                NSLog(@"Fragment shader compile log: %@", fragLog);
                NSString *vertLog = [colorSwizzlingProgram vertexShaderLog];
                NSLog(@"Vertex shader compile log: %@", vertLog);
                colorSwizzlingProgram = nil;
                NSAssert(NO, @"Filter shader link failed");
            }
        }

        colorSwizzlingPositionAttribute = [colorSwizzlingProgram attributeIndex:@"position"];
        colorSwizzlingTextureCoordinateAttribute = [colorSwizzlingProgram attributeIndex:@"inputTextureCoordinate"];
        colorSwizzlingInputTextureUniform = [colorSwizzlingProgram uniformIndex:@"inputImageTexture"];

        [_movieWriterContext setContextShaderProgram:colorSwizzlingProgram];

        glEnableVertexAttribArray(colorSwizzlingPositionAttribute);
        glEnableVertexAttribArray(colorSwizzlingTextureCoordinateAttribute);
    });

    [self initializeMovieWithOutputSettings:outputSettings];

    return self;
}

- (void)initializeMovieWithOutputSettings:(NSDictionary *)outputSettings;
{
    isRecording = NO;

    self.enabled = YES;
    NSError *error = nil;
    // Create the AVAssetWriter with the file URL and file type
    assetWriter = [[AVAssetWriter alloc] initWithURL:movieURL fileType:fileType error:&error];
    // Handle initialization failure
    if (error != nil) {
        NSLog(@"Error: %@", error);
        if (failureBlock) {
            failureBlock(error);
        } else {
            if (self.delegate && [self.delegate respondsToSelector:@selector(movieRecordingFailedWithError:)]) {
                [self.delegate movieRecordingFailedWithError:error];
            }
        }
    }

    // Set this to make sure that a functional movie is produced, even if the recording is cut off mid-stream. Only the last second should be lost in that case.
    assetWriter.movieFragmentInterval = CMTimeMakeWithSeconds(1.0, 1000);

    // Configure the video size and codec
    if (outputSettings == nil) {
        NSMutableDictionary *settings = [[NSMutableDictionary alloc] init];
        [settings setObject:AVVideoCodecH264 forKey:AVVideoCodecKey];
        [settings setObject:[NSNumber numberWithInt:videoSize.width] forKey:AVVideoWidthKey];
        [settings setObject:[NSNumber numberWithInt:videoSize.height] forKey:AVVideoHeightKey];
        outputSettings = settings;
    }
    // If settings were passed in, make sure the required keys are present
    else {
        __unused NSString *videoCodec = [outputSettings objectForKey:AVVideoCodecKey];
        __unused NSNumber *width = [outputSettings objectForKey:AVVideoWidthKey];
        __unused NSNumber *height = [outputSettings objectForKey:AVVideoHeightKey];

        NSAssert(videoCodec && width && height, @"OutputSettings is missing required parameters.");

        if ([outputSettings objectForKey:@"EncodingLiveVideo"]) {
            NSMutableDictionary *tmp = [outputSettings mutableCopy];
            [tmp removeObjectForKey:@"EncodingLiveVideo"];
            outputSettings = tmp;
        }
    }

    /*
    NSDictionary *videoCleanApertureSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                                [NSNumber numberWithInt:videoSize.width], AVVideoCleanApertureWidthKey,
                                                [NSNumber numberWithInt:videoSize.height], AVVideoCleanApertureHeightKey,
                                                [NSNumber numberWithInt:0], AVVideoCleanApertureHorizontalOffsetKey,
                                                [NSNumber numberWithInt:0], AVVideoCleanApertureVerticalOffsetKey,
                                                nil];

    NSDictionary *videoAspectRatioSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                              [NSNumber numberWithInt:3], AVVideoPixelAspectRatioHorizontalSpacingKey,
                                              [NSNumber numberWithInt:3], AVVideoPixelAspectRatioVerticalSpacingKey,
                                              nil];

    NSMutableDictionary *compressionProperties = [[NSMutableDictionary alloc] init];
    [compressionProperties setObject:videoCleanApertureSettings forKey:AVVideoCleanApertureKey];
    [compressionProperties setObject:videoAspectRatioSettings forKey:AVVideoPixelAspectRatioKey];
    [compressionProperties setObject:[NSNumber numberWithInt: 2000000] forKey:AVVideoAverageBitRateKey];
    [compressionProperties setObject:[NSNumber numberWithInt: 16] forKey:AVVideoMaxKeyFrameIntervalKey];
    [compressionProperties setObject:AVVideoProfileLevelH264Main31 forKey:AVVideoProfileLevelKey];

    [outputSettings setObject:compressionProperties forKey:AVVideoCompressionPropertiesKey];
    */

    // Create the video AVAssetWriterInput
    assetWriterVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:outputSettings];
    // Real-time data; without this, frames may be dropped
    assetWriterVideoInput.expectsMediaDataInRealTime = _encodingLiveVideo;

    // You need to use BGRA for the video in order to get realtime encoding. I use a color-swizzling shader to line up glReadPixels' normal RGBA output with the movie input's BGRA.
    // Pixel format fed into the encoder
    NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
                                                           [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey,
                                                           [NSNumber numberWithInt:videoSize.width], kCVPixelBufferWidthKey,
                                                           [NSNumber numberWithInt:videoSize.height], kCVPixelBufferHeightKey,
                                                           nil];
//    NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey,
//                                                           nil];

    // Create the AVAssetWriterInputPixelBufferAdaptor
    assetWriterPixelBufferInput = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:assetWriterVideoInput sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];

    [assetWriter addInput:assetWriterVideoInput];
}
```
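
Based on the checks in initializeMovieWithOutputSettings: above, a writer with explicit encoding settings could be created roughly as follows (a sketch; the bitrate and size values are arbitrary examples, and `movieURL` is assumed to be a prepared file NSURL):

```objc
// Sketch: custom output settings must at least contain codec, width and height,
// otherwise the NSAssert in initializeMovieWithOutputSettings: fires.
NSDictionary *compression = @{ AVVideoAverageBitRateKey : @(2000000),
                               AVVideoProfileLevelKey   : AVVideoProfileLevelH264Main31 };
NSDictionary *outputSettings = @{ AVVideoCodecKey                 : AVVideoCodecH264,
                                  AVVideoWidthKey                 : @(480),
                                  AVVideoHeightKey                : @(640),
                                  AVVideoCompressionPropertiesKey : compression,
                                  @"EncodingLiveVideo"            : @(YES) };

GPUImageMovieWriter *writer =
    [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL
                                             size:CGSizeMake(480, 640)
                                         fileType:AVFileTypeQuickTimeMovie
                                   outputSettings:[outputSettings mutableCopy]];
```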

Other methods.

```objc
// Enable writing of audio data
- (void)setHasAudioTrack:(BOOL)hasAudioTrack audioSettings:(NSDictionary *)audioOutputSettings;
// Start, finish, and cancel recording
- (void)startRecording;
- (void)startRecordingInOrientation:(CGAffineTransform)orientationTransform;
- (void)finishRecording;
- (void)finishRecordingWithCompletionHandler:(void (^)(void))handler;
- (void)cancelRecording;
// Process audio
- (void)processAudioBuffer:(CMSampleBufferRef)audioBuffer;
// Hook up the videoInputReadyCallback / audioInputReadyCallback synchronization callbacks
- (void)enableSynchronizationCallbacks;
```

GPUImageMovieWriter does not have many methods, but each one is fairly long and its internals are relatively involved. Only the common ones are shown here; if you need to record video, it is worth reading the GPUImageMovieWriter source in full.

```objc
// Configure the audio parameters: codec, channel count, sample rate, bitrate
- (void)setHasAudioTrack:(BOOL)newValue audioSettings:(NSDictionary *)audioOutputSettings;
{
    _hasAudioTrack = newValue;

    if (_hasAudioTrack) {
        if (_shouldPassthroughAudio) {
            // Do not set any settings so audio will be the same as passthrough
            audioOutputSettings = nil;
        } else if (audioOutputSettings == nil) {
            AVAudioSession *sharedAudioSession = [AVAudioSession sharedInstance];
            double preferredHardwareSampleRate;

            if ([sharedAudioSession respondsToSelector:@selector(sampleRate)]) {
                preferredHardwareSampleRate = [sharedAudioSession sampleRate];
            } else {
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wdeprecated-declarations"
                preferredHardwareSampleRate = [[AVAudioSession sharedInstance] currentHardwareSampleRate];
#pragma clang diagnostic pop
            }

            AudioChannelLayout acl;
            bzero(&acl, sizeof(acl));
            acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;

            audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   [NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
                                   [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
                                   [NSNumber numberWithFloat:preferredHardwareSampleRate], AVSampleRateKey,
                                   [NSData dataWithBytes:&acl length:sizeof(acl)], AVChannelLayoutKey,
                                   //[ NSNumber numberWithInt:AVAudioQualityLow], AVEncoderAudioQualityKey,
                                   [NSNumber numberWithInt:64000], AVEncoderBitRateKey,
                                   nil];
            /*
            AudioChannelLayout acl;
            bzero( &acl, sizeof(acl));
            acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;

            audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   [ NSNumber numberWithInt: kAudioFormatMPEG4AAC ], AVFormatIDKey,
                                   [ NSNumber numberWithInt: 1 ], AVNumberOfChannelsKey,
                                   [ NSNumber numberWithFloat: 44100.0 ], AVSampleRateKey,
                                   [ NSNumber numberWithInt: 64000 ], AVEncoderBitRateKey,
                                   [ NSData dataWithBytes: &acl length: sizeof( acl ) ], AVChannelLayoutKey,
                                   nil];*/
        }

        assetWriterAudioInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:audioOutputSettings];
        [assetWriter addInput:assetWriterAudioInput];
        assetWriterAudioInput.expectsMediaDataInRealTime = _encodingLiveVideo;
    } else {
        // Remove audio track if it exists
    }
}

- (void)finishRecordingWithCompletionHandler:(void (^)(void))handler;
{
    runSynchronouslyOnContextQueue(_movieWriterContext, ^{
        isRecording = NO;

        if (assetWriter.status == AVAssetWriterStatusCompleted || assetWriter.status == AVAssetWriterStatusCancelled || assetWriter.status == AVAssetWriterStatusUnknown) {
            if (handler)
                runAsynchronouslyOnContextQueue(_movieWriterContext, handler);
            return;
        }
        if (assetWriter.status == AVAssetWriterStatusWriting && !videoEncodingIsFinished) {
            videoEncodingIsFinished = YES;
            [assetWriterVideoInput markAsFinished];
        }
        if (assetWriter.status == AVAssetWriterStatusWriting && !audioEncodingIsFinished) {
            audioEncodingIsFinished = YES;
            [assetWriterAudioInput markAsFinished];
        }
#if (!defined(__IPHONE_6_0) || (__IPHONE_OS_VERSION_MAX_ALLOWED < __IPHONE_6_0))
        // Not iOS 6 SDK
        [assetWriter finishWriting];
        if (handler)
            runAsynchronouslyOnContextQueue(_movieWriterContext, handler);
#else
        // iOS 6 SDK
        if ([assetWriter respondsToSelector:@selector(finishWritingWithCompletionHandler:)]) {
            // Running iOS 6
            [assetWriter finishWritingWithCompletionHandler:(handler ?: ^{ })];
        } else {
            // Not running iOS 6
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wdeprecated-declarations"
            [assetWriter finishWriting];
#pragma clang diagnostic pop
            if (handler)
                runAsynchronouslyOnContextQueue(_movieWriterContext, handler);
        }
#endif
    });
}

// Process audio data
- (void)processAudioBuffer:(CMSampleBufferRef)audioBuffer;
{
    if (!isRecording || _paused) {
        return;
    }

//    if (_hasAudioTrack && CMTIME_IS_VALID(startTime))
    // Only process when there is an audio track
    if (_hasAudioTrack) {
        CFRetain(audioBuffer);

        // PTS of the audio buffer
        CMTime currentSampleTime = CMSampleBufferGetOutputPresentationTimeStamp(audioBuffer);

        if (CMTIME_IS_INVALID(startTime)) {
            runSynchronouslyOnContextQueue(_movieWriterContext, ^{
                // If the asset writer is not yet writing, start it
                if ((audioInputReadyCallback == NULL) && (assetWriter.status != AVAssetWriterStatusWriting)) {
                    [assetWriter startWriting];
                }
                // Set the session start PTS
                [assetWriter startSessionAtSourceTime:currentSampleTime];
                startTime = currentSampleTime;
            });
        }

        // If the buffer cannot be consumed, invalidate it when requested and drop it
        if (!assetWriterAudioInput.readyForMoreMediaData && _encodingLiveVideo) {
            NSLog(@"1: Had to drop an audio frame: %@", CFBridgingRelease(CMTimeCopyDescription(kCFAllocatorDefault, currentSampleTime)));
            if (_shouldInvalidateAudioSampleWhenDone) {
                CMSampleBufferInvalidate(audioBuffer);
            }
            CFRelease(audioBuffer);
            return;
        }

        if (discont) {
            discont = NO;

            CMTime current;
            if (offsetTime.value > 0) {
                current = CMTimeSubtract(currentSampleTime, offsetTime);
            } else {
                current = currentSampleTime;
            }

            CMTime offset = CMTimeSubtract(current, previousAudioTime);
            if (offsetTime.value == 0) {
                offsetTime = offset;
            } else {
                offsetTime = CMTimeAdd(offsetTime, offset);
            }
        }

        if (offsetTime.value > 0) {
            CFRelease(audioBuffer);
            audioBuffer = [self adjustTime:audioBuffer by:offsetTime];
            CFRetain(audioBuffer);
        }

        // record most recent time so we know the length of the pause
        currentSampleTime = CMSampleBufferGetPresentationTimeStamp(audioBuffer);
        previousAudioTime = currentSampleTime;

        // if the consumer wants to do something with the audio samples before writing, let him.
        if (self.audioProcessingCallback) {
            // need to introspect into the opaque CMBlockBuffer structure to find its raw sample buffers.
            CMBlockBufferRef buffer = CMSampleBufferGetDataBuffer(audioBuffer);
            CMItemCount numSamplesInBuffer = CMSampleBufferGetNumSamples(audioBuffer);
            AudioBufferList audioBufferList;

            CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(audioBuffer, NULL, &audioBufferList, sizeof(audioBufferList), NULL, NULL, kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment, &buffer);

            // passing a live pointer to the audio buffers, try to process them in-place or we might have syncing issues.
            for (int bufferCount = 0; bufferCount < audioBufferList.mNumberBuffers; bufferCount++) {
                SInt16 *samples = (SInt16 *)audioBufferList.mBuffers[bufferCount].mData;
                self.audioProcessingCallback(&samples, numSamplesInBuffer);
            }
        }

//        NSLog(@"Recorded audio sample time: %lld, %d, %lld", currentSampleTime.value, currentSampleTime.timescale, currentSampleTime.epoch);
        // Block that actually writes the audio
        void(^write)() = ^() {
            // If we cannot append yet (and are not encoding live video), wait
            while (!assetWriterAudioInput.readyForMoreMediaData && !_encodingLiveVideo && !audioEncodingIsFinished) {
                NSDate *maxDate = [NSDate dateWithTimeIntervalSinceNow:0.5];
                //NSLog(@"audio waiting...");
                [[NSRunLoop currentRunLoop] runUntilDate:maxDate];
            }
            if (!assetWriterAudioInput.readyForMoreMediaData) {
                NSLog(@"2: Had to drop an audio frame %@", CFBridgingRelease(CMTimeCopyDescription(kCFAllocatorDefault, currentSampleTime)));
            }
            // Only append when readyForMoreMediaData is YES
            else if (assetWriter.status == AVAssetWriterStatusWriting) {
                if (![assetWriterAudioInput appendSampleBuffer:audioBuffer])
                    NSLog(@"Problem appending audio buffer at time: %@", CFBridgingRelease(CMTimeCopyDescription(kCFAllocatorDefault, currentSampleTime)));
            } else {
                //NSLog(@"Wrote an audio frame %@", CFBridgingRelease(CMTimeCopyDescription(kCFAllocatorDefault, currentSampleTime)));
            }

            // Mark the buffer as no longer usable
            if (_shouldInvalidateAudioSampleWhenDone) {
                CMSampleBufferInvalidate(audioBuffer);
            }
            CFRelease(audioBuffer);
        };

//        runAsynchronouslyOnContextQueue(_movieWriterContext, write);
        // For live video, dispatch onto the shared context queue; otherwise write directly
        if (_encodingLiveVideo) {
            runAsynchronouslyOnContextQueue(_movieWriterContext, write);
        } else {
            write();
        }
    }
}
```
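
To round off the writer section, here is a minimal sketch of the recording lifecycle using the completion-handler variant (the demo later in this article uses the plain finishRecording for brevity; `writer` and `movieURL` are assumed to exist as in the earlier sketches):

```objc
// Sketch: start recording, then finish asynchronously and only touch the file once the handler runs.
[writer startRecording];

// ... later, e.g. when the user taps "stop" ...
[writer finishRecordingWithCompletionHandler:^{
    NSLog(@"Movie written to %@", movieURL);
    // The file is complete here; safe to play back, move, or upload it.
}];
```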

GPUImageMovie

GPUImageMovie's main job is to read and decode audio/video files. It inherits from GPUImageOutput, so it can produce framebuffer objects; since it does not implement the GPUImageInput protocol, it can only act as a source, never as a target in a filter chain.

Initialization. It can be initialized with an NSURL, an AVPlayerItem, or an AVAsset.

```objc
- (id)initWithAsset:(AVAsset *)asset;
- (id)initWithPlayerItem:(AVPlayerItem *)playerItem;
- (id)initWithURL:(NSURL *)url;
```

Initialization is straightforward: the passed-in data is simply stored.

```objc
- (id)initWithURL:(NSURL *)url;
{
    if (!(self = [super init])) {
        return nil;
    }

    [self yuvConversionSetup];

    self.url = url;
    self.asset = nil;

    return self;
}

- (id)initWithAsset:(AVAsset *)asset;
{
    if (!(self = [super init])) {
        return nil;
    }

    [self yuvConversionSetup];

    self.url = nil;
    self.asset = asset;

    return self;
}

- (id)initWithPlayerItem:(AVPlayerItem *)playerItem;
{
    if (!(self = [super init])) {
        return nil;
    }

    [self yuvConversionSetup];

    self.url = nil;
    self.asset = nil;
    self.playerItem = playerItem;

    return self;
}
```

Other methods. GPUImageMovie has few methods, but they contain a lot of code and are relatively complex, so they are not examined line by line here. They fall into these groups: 1. reading audio/video data; 2. controlling reading (start, pause, cancel); 3. processing audio/video frames.

```objc
// Allow synchronized audio/video encoding with a GPUImageMovieWriter
- (void)enableSynchronizedEncodingUsingMovieWriter:(GPUImageMovieWriter *)movieWriter;
// Read audio/video
- (BOOL)readNextVideoFrameFromOutput:(AVAssetReaderOutput *)readerVideoTrackOutput;
- (BOOL)readNextAudioSampleFromOutput:(AVAssetReaderOutput *)readerAudioTrackOutput;
// Start, end, and cancel reading
- (void)startProcessing;
- (void)endProcessing;
- (void)cancelProcessing;
// Process a video frame
- (void)processMovieFrame:(CMSampleBufferRef)movieSampleBuffer;
```

Example implementations

Recording video

#import"ViewController.h"#import#defineDOCUMENT(path) [[NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject] stringByAppendingPathComponent:path]@interfaceViewController()@property(weak,nonatomic)IBOutlet GPUImageView*imageView;@property(strong,nonatomic)GPUImageVideoCamera*video;@property(strong,nonatomic)GPUImageMovieWriter*writer;@property(nonatomic,strong)NSURL*videoFile;@property(nonatomic,readonly,getter=isRecording)BOOL recording;@end@implementationViewController-(void)viewDidLoad{[superviewDidLoad];_recording=NO;// 設(shè)置背景色[_imageView setBackgroundColorRed:1.0green:1.0blue:1.0alpha:1.0];// 設(shè)置保存文件路徑_videoFile=[NSURL fileURLWithPath:DOCUMENT(@"/1.mov")];// 刪除文件[[NSFileManager defaultManager]removeItemAtURL:_videoFile error:nil];// 設(shè)置GPUImageMovieWriter_writer=[[GPUImageMovieWriter alloc]initWithMovieURL:_videoFile size:CGSizeMake(480,640)];[_writer setHasAudioTrack:YES audioSettings:nil];// 設(shè)置GPUImageVideoCamera_video=[[GPUImageVideoCamera alloc]initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];_video.outputImageOrientation=UIInterfaceOrientationPortrait;[_video addAudioInputsAndOutputs];// 設(shè)置音頻處理Target_video.audioEncodingTarget=_writer;// 設(shè)置Target[_video addTarget:_imageView];[_video addTarget:_writer];// 開始拍攝[_video startCameraCapture];}-(IBAction)startButtonTapped:(UIButton*)sender{if(!_recording){// 開始錄制視頻[_writer startRecording];_recording=YES;}}-(IBAction)finishButtonTapped:(UIButton*)sender{// 結(jié)束錄制[_writer finishRecording];}@end

Taking a photo

#import"SecondViewController.h"#import"ImageShowViewController.h"#import@interfaceSecondViewController()@property(weak,nonatomic)IBOutlet GPUImageView*imageView;@property(nonatomic,strong)GPUImageStillCamera*camera;@property(nonatomic,strong)GPUImageFilter*filter;@end@implementationSecondViewController-(void)viewDidLoad{[superviewDidLoad];// 設(shè)置背景色[_imageView setBackgroundColorRed:1.0green:1.0blue:1.0alpha:1.0];// 濾鏡_filter=[[GPUImageGrayscaleFilter alloc]init];// 初始化_camera=[[GPUImageStillCamera alloc]initWithSessionPreset:AVCaptureSessionPresetPhoto cameraPosition:AVCaptureDevicePositionBack];_camera.outputImageOrientation=UIInterfaceOrientationPortrait;[_camera addTarget:_filter];[_filter addTarget:_imageView];// 開始運(yùn)行[_camera startCameraCapture];}-(IBAction)pictureButtonTapped:(UIButton*)sender{if([_camera isRunning]){[_camera capturePhotoAsImageProcessedUpToFilter:_filter withCompletionHandler:^(UIImage*processedImage,NSError*error){[_camera stopCameraCapture];ImageShowViewController*imageShowVC=[[UIStoryboard storyboardWithName:@"Main"bundle:nil]instantiateViewControllerWithIdentifier:@"ImageShowViewController"];imageShowVC.image=processedImage;[selfpresentViewController:imageShowVC animated:YES completion:NULL];}];}else{[_camera startCameraCapture];}}

Video transcoding with a filter. Because the AAC audio has to be decoded to PCM, I modified line 230 of GPUImageMovie.m (to get rid of the error "[AVAssetWriterInput appendSampleBuffer:] Cannot append sample buffer: Input buffer must be in an uncompressed format when outputSettings is not nil"), as shown below:

```objc
NSDictionary *audioOutputSetting = @{ AVFormatIDKey : @(kAudioFormatLinearPCM) };
// This might need to be extended to handle movies with more than one audio track
AVAssetTrack *audioTrack = [audioTracks objectAtIndex:0];
readerAudioTrackOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack outputSettings:audioOutputSetting];
```

#import"ThirdViewController.h"#import#defineDOCUMENT(path) [[NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject] stringByAppendingPathComponent:path]@interfaceThirdViewController()@property(weak,nonatomic)IBOutlet GPUImageView*imageView;@property(nonatomic,strong)GPUImageMovie*movie;@property(nonatomic,strong)GPUImageMovieWriter*movieWriter;@property(nonatomic,strong)GPUImageFilter*filter;@property(nonatomic,assign)CGSize size;@end@implementationThirdViewController-(void)viewDidLoad{[superviewDidLoad];// 獲取文件路徑NSURL*fileURL=[[NSBundle mainBundle]URLForResource:@"1.mp4"withExtension:nil];AVAsset*asset=[AVAsset assetWithURL:fileURL];// 獲取視頻寬高NSArray*tracks=[asset tracksWithMediaType:AVMediaTypeVideo];AVAssetTrack*videoTrack=[tracks firstObject];_size=videoTrack.naturalSize;// 初始化GPUImageMovie_movie=[[GPUImageMovie alloc]initWithAsset:asset];// 濾鏡_filter=[[GPUImageGrayscaleFilter alloc]init];[_movie addTarget:_filter];[_filter addTarget:_imageView];}-(IBAction)playButtonTapped:(UIButton*)sender{[_movie startProcessing];}-(IBAction)transcodeButtonTapped:(id)sender{// 文件路徑NSURL*videoFile=[NSURL fileURLWithPath:DOCUMENT(@"/2.mov")];[[NSFileManager defaultManager]removeItemAtURL:videoFile error:nil];// GPUImageMovieWriter_movieWriter=[[GPUImageMovieWriter alloc]initWithMovieURL:videoFile size:_size];[_movieWriter setHasAudioTrack:YES audioSettings:nil];// GPUImageMovie相關(guān)設(shè)置_movie.audioEncodingTarget=_movieWriter;[_filter addTarget:_movieWriter];[_movie enableSynchronizedEncodingUsingMovieWriter:_movieWriter];// 開始轉(zhuǎn)碼[_movieWriter startRecording];[_movie startProcessing];// 結(jié)束__weaktypeof(_movieWriter)wMovieWriter=_movieWriter;__weaktypeof(self)wSelf=self;[_movieWriter setCompletionBlock:^{[wMovieWriter finishRecording];[wSelf.movie removeTarget:wMovieWriter];wSelf.movie.audioEncodingTarget=nil;}];}

Summary

GPUImageVideoCamera, GPUImageStillCamera, GPUImageMovieWriter, and GPUImageMovie are extremely useful when working with the camera and with audio/video. Space does not allow covering all of their source code here; read it in full if you need the details.

Source code: GPUImage source reading series https://github.com/QinminiOS/GPUImage

Series: GPUImage源碼閱讀 (GPUImage source reading) http://www.reibang.com/nb/11749791
