Version History
Version | Date |
---|---|
V1.0 | 2017.09.02 |
Preface
GPUImage is a technique that uses the graphics card directly to process video and images. If you are interested, see the following articles:
1. GPUImage Analysis (Part 1): Basic Overview (1)
2. GPUImage Analysis (Part 2): Basic Overview (2)
3. GPUImage Analysis (Part 3): Basic Overview (3)
4. GPUImage Analysis (Part 4): Installation and Framework Introduction
Base Classes in the Framework
The framework really splits into two parts: the base classes and the filters. This article covers the base-class part of the framework.
// Base classes
#import "GPUImageContext.h"
#import "GPUImageOutput.h"
#import "GPUImageView.h"
#import "GPUImageVideoCamera.h"
#import "GPUImageStillCamera.h"
#import "GPUImageMovie.h"
#import "GPUImagePicture.h"
#import "GPUImageRawDataInput.h"
#import "GPUImageRawDataOutput.h"
#import "GPUImageMovieWriter.h"
#import "GPUImageFilterPipeline.h"
#import "GPUImageTextureOutput.h"
#import "GPUImageFilterGroup.h"
#import "GPUImageTextureInput.h"
#import "GPUImageUIElement.h"
#import "GPUImageBuffer.h"
#import "GPUImageFramebuffer.h"
#import "GPUImageFramebufferCache.h"
Detailed Analysis of the Base Classes
Below we look at each of these base classes in detail.
1. GPUImageContext
- Inheritance and properties
@interface GPUImageContext : NSObject
@property(readonly, nonatomic) dispatch_queue_t contextQueue;
@property(readwrite, retain, nonatomic) GLProgram *currentShaderProgram;
@property(readonly, retain, nonatomic) EAGLContext *context;
@property(readonly) CVOpenGLESTextureCacheRef coreVideoTextureCache;
@property(readonly) GPUImageFramebufferCache *framebufferCache;
- Purpose:
GPUImageContext is GPUImage's wrapper around the OpenGL ES context. It adds the GPUImage-related state on top of it, such as the shader-program cache, the processing queue, and the Core Video texture cache.
Key properties
- contextQueue: the shared processing queue
- currentShaderProgram: the shader program currently in use
- context: the OpenGL ES context
- coreVideoTextureCache: the Core Video texture cache
- framebufferCache: the GPUImageFramebuffer cache
- shaderProgramCache: the cache of compiled programs
- shaderProgramUsageHistory: the program usage history
Key methods
- useAsCurrentContext()
When setting the current context, it first checks whether this context is already current and only sets it if not, to avoid the cost of a context switch (setting a context costs performance even when it is the same context).
- sizeThatFitsWithinATextureForSize()
Adjusts a texture size; if it exceeds the maximum texture size, the width and height are scaled down so they fit within the maximum.
- (GLProgram *)programForVertexShaderString:fragmentShaderString:
shaderProgramCache is the program cache; the key is the vertex shader string concatenated with the fragment shader string.
- (void)useSharegroup:(EAGLSharegroup *)sharegroup;
EAGLSharegroup manages the OpenGL ES resources (textures, buffers, framebuffers, and render buffers) of one or more EAGLContexts. It is an opaque class with no developer-facing API.
- (EAGLContext *)context;
Returns the OpenGL ES 2.0 context and calls glDisable(GL_DEPTH_TEST); the image-processing pipeline does not use the depth buffer by default.
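To make the caching behaviour concrete, here is a minimal sketch (not the framework's exact internal code) of what a filter effectively does with GPUImageContext; the two shader string constants come from GPUImageFilter.h, and the helper function name is just an illustration:
#import "GPUImage.h"
static void DemoContextUsage(void)
{
    // Run on GPUImage's shared processing queue (contextQueue) with its GL context current.
    runSynchronouslyOnVideoProcessingQueue(^{
        [GPUImageContext useImageProcessingContext];
        // Fetch a cached program, or compile and cache one keyed by the two shader strings.
        GLProgram *program = [[GPUImageContext sharedImageProcessingContext]
            programForVertexShaderString:kGPUImageVertexShaderString
                    fragmentShaderString:kGPUImagePassthroughFragmentShaderString];
        if (!program.initialized)
        {
            [program addAttribute:@"position"];
            [program addAttribute:@"inputTextureCoordinate"];
            if (![program link])
            {
                NSLog(@"Program link failed: %@", [program programLog]);
            }
        }
        // Only switches the GL program when this one is not already active.
        [GPUImageContext setActiveShaderProgram:program];
        // Clamp an oversized texture to the device's maximum texture dimensions.
        CGSize fitted = [GPUImageContext sizeThatFitsWithinATextureForSize:CGSizeMake(8192.0, 8192.0)];
        NSLog(@"fitted size: %@", NSStringFromCGSize(fitted));
    });
}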
2. GPUImageOutput
- Inheritance and properties
/** GPUImage's base source object
Images or frames of video are uploaded from source objects, which are subclasses of GPUImageOutput. These include:
- GPUImageVideoCamera (for live video from an iOS camera)
- GPUImageStillCamera (for taking photos with the camera)
- GPUImagePicture (for still images)
- GPUImageMovie (for movies)
Source objects upload still image frames to OpenGL ES as textures, then hand those textures off to the next objects in the processing chain.
*/
@interface GPUImageOutput : NSObject
{
GPUImageFramebuffer *outputFramebuffer;
NSMutableArray *targets, *targetTextureIndices;
CGSize inputTextureSize, cachedMaximumOutputSize, forcedMaximumSize;
BOOL overrideInputSize;
BOOL allTargetsWantMonochromeData;
BOOL usingNextFrameForImageCapture;
}
- Purpose:
GPUImageOutput uploads still-image frames to OpenGL ES as textures and then hands those textures to the next object in the processing chain. Its subclasses provide the ability to retrieve the image after filtering.
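As a small illustration of that hand-off (GPUImagePicture and GPUImageSepiaFilter stand in for any source and any target), a still image can be pushed through a source and the filtered result read back from the filter:
#import "GPUImage.h"
- (UIImage *)sepiaVersionOfImage:(UIImage *)inputImage
{
    GPUImagePicture *source = [[GPUImagePicture alloc] initWithImage:inputImage];
    GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];
    // A GPUImageOutput hands its texture to every target that was added to it.
    [source addTarget:sepiaFilter];
    // Keep the filter's framebuffer around so the result can be read back afterwards.
    [sepiaFilter useNextFrameForImageCapture];
    [source processImage];
    return [sepiaFilter imageFromCurrentFramebuffer];
}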
3. GPUImageView
- Inheritance and properties
/**
UIView subclass to use as an endpoint for displaying GPUImage outputs
*/
@interface GPUImageView : UIView <GPUImageInput>
{
GPUImageRotationMode inputRotation;
}
/** The fill mode dictates how images are fit in the view, with the default being kGPUImageFillModePreserveAspectRatio
*/
@property(readwrite, nonatomic) GPUImageFillModeType fillMode;
/** This calculates the current display size, in pixels, taking into account Retina scaling factors
*/
@property(readonly, nonatomic) CGSize sizeInPixels;
@property(nonatomic) BOOL enabled;
/** Handling fill mode
@param redComponent Red component for background color
@param greenComponent Green component for background color
@param blueComponent Blue component for background color
@param alphaComponent Alpha component for background color
*/
- (void)setBackgroundColorRed:(GLfloat)redComponent green:(GLfloat)greenComponent blue:(GLfloat)blueComponent alpha:(GLfloat)alphaComponent;
- (void)setCurrentlyReceivingMonochromeInput:(BOOL)newValue;
@end
- Purpose: GPUImageView is the image view, a UIView subclass used as the endpoint of a chain to display GPUImage output on screen.
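A short sketch of configuring a GPUImageView before attaching it as the final target of a chain:
#import "GPUImage.h"
- (GPUImageView *)makePreviewView
{
    GPUImageView *previewView =
        [[GPUImageView alloc] initWithFrame:[UIScreen mainScreen].bounds];
    // Fill the view while preserving the aspect ratio, over a black background.
    previewView.fillMode = kGPUImageFillModePreserveAspectRatioAndFill;
    [previewView setBackgroundColorRed:0.0 green:0.0 blue:0.0 alpha:1.0];
    // Upstream objects simply add the view as a target, e.g. [filter addTarget:previewView];
    return previewView;
}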
4. GPUImageVideoCamera
- Inheritance and properties
/**
A GPUImageOutput that provides frames from either camera
*/
@interface GPUImageVideoCamera : GPUImageOutput <AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate>
{
NSUInteger numberOfFramesCaptured;
CGFloat totalFrameTimeDuringCapture;
AVCaptureSession *_captureSession;
AVCaptureDevice *_inputCamera;
AVCaptureDevice *_microphone;
AVCaptureDeviceInput *videoInput;
AVCaptureVideoDataOutput *videoOutput;
BOOL capturePaused;
GPUImageRotationMode outputRotation, internalRotation;
dispatch_semaphore_t frameRenderingSemaphore;
BOOL captureAsYUV;
GLuint luminanceTexture, chrominanceTexture;
__unsafe_unretained id<GPUImageVideoCameraDelegate> _delegate;
}
/// The AVCaptureSession used to capture from the camera
@property(readonly, retain, nonatomic) AVCaptureSession *captureSession;
/// This enables the capture session preset to be changed on the fly
@property (readwrite, nonatomic, copy) NSString *captureSessionPreset;
/// This sets the frame rate of the camera (iOS 5 and above only)
/**
Setting this to 0 or below will set the frame rate back to the default setting for a particular preset.
*/
@property (readwrite) int32_t frameRate;
/// Easy way to tell which cameras are present on device
@property (readonly, getter = isFrontFacingCameraPresent) BOOL frontFacingCameraPresent;
@property (readonly, getter = isBackFacingCameraPresent) BOOL backFacingCameraPresent;
/// This enables the benchmarking mode, which logs out instantaneous and average frame times to the console
@property(readwrite, nonatomic) BOOL runBenchmark;
/// Use this property to manage camera settings. Focus point, exposure point, etc.
@property(readonly) AVCaptureDevice *inputCamera;
/// This determines the rotation applied to the output image, based on the source material
@property(readwrite, nonatomic) UIInterfaceOrientation outputImageOrientation;
/// These properties determine whether or not the two camera orientations should be mirrored. By default, both are NO.
@property(readwrite, nonatomic) BOOL horizontallyMirrorFrontFacingCamera, horizontallyMirrorRearFacingCamera;
@property(nonatomic, assign) id<GPUImageVideoCameraDelegate> delegate;
- Purpose:
GPUImageVideoCamera is a subclass of GPUImageOutput that supplies frames from the camera as source data; it is usually the head of the chain. GPUImage uses AVFoundation to capture video.
AVCaptureSession coordinates the flow of data from AV input devices to a designated output. For real-time capture, create an AVCaptureSession, add a suitable input (AVCaptureDeviceInput) and output (for example AVCaptureMovieFileOutput), then call startRunning to start the flow of data from input to output and stopRunning to stop it. Note that startRunning takes some time to return, so it should not be called on the main (UI) thread, to avoid blocking the UI.
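A minimal live-filter chain might look like the sketch below; the session preset, the filter, and the assumption that the controller's view is a GPUImageView are arbitrary choices:
#import "GPUImage.h"
- (void)setupLiveFilter
{
    GPUImageVideoCamera *videoCamera =
        [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                            cameraPosition:AVCaptureDevicePositionBack];
    videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
    GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];
    GPUImageView *filterView = (GPUImageView *)self.view; // assumes the controller's view is a GPUImageView
    // camera -> filter -> view
    [videoCamera addTarget:sepiaFilter];
    [sepiaFilter addTarget:filterView];
    [videoCamera startCameraCapture];
    // In a real app, store videoCamera and the filter in strong properties;
    // otherwise ARC releases them when this method returns and capture stops.
}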
5. GPUImageStillCamera
- Inheritance and properties
@interface GPUImageStillCamera : GPUImageVideoCamera
/** The JPEG compression quality to use when capturing a photo as a JPEG.
*/
@property CGFloat jpegCompressionQuality;
// Only reliably set inside the context of the completion handler of one of the capture methods
@property (readonly) NSDictionary *currentCaptureMetadata;
- Purpose: GPUImageStillCamera drives the system camera with real-time filtering. It inherits from GPUImageVideoCamera and adds the ability to capture photos.
Usage steps (a sketch follows the list)
- Create the preview view, i.e. the required GPUImageView
- Create the filter
- Create the camera, i.e. the GPUImageStillCamera we want to use
- addTarget and start processing with startCameraCapture
- Handle the callback data and write it to the photo album
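A sketch of those steps; the output path is hypothetical, error handling is trimmed, and as above the camera, filter, and view should live in strong properties in real code:
#import "GPUImage.h"
- (void)setupStillCameraAndCapture
{
    // 1. Preview view (assumed to be added to the view hierarchy elsewhere).
    GPUImageView *previewView = [[GPUImageView alloc] initWithFrame:[UIScreen mainScreen].bounds];
    // 2. Filter.
    GPUImageGammaFilter *filter = [[GPUImageGammaFilter alloc] init];
    // 3. Camera.
    GPUImageStillCamera *stillCamera =
        [[GPUImageStillCamera alloc] initWithSessionPreset:AVCaptureSessionPresetPhoto
                                            cameraPosition:AVCaptureDevicePositionBack];
    stillCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
    // 4. addTarget and start capturing.
    [stillCamera addTarget:filter];
    [filter addTarget:previewView];
    [stillCamera startCameraCapture];
    // 5. Capture a filtered JPEG and handle the data in the completion block
    //    (written to a temporary file here; saving to the photo album works the same way).
    [stillCamera capturePhotoAsJPEGProcessedUpToFilter:filter
                                 withCompletionHandler:^(NSData *processedJPEG, NSError *error) {
        if (!error) {
            NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"photo.jpg"];
            [processedJPEG writeToFile:path atomically:YES];
        }
    }];
}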
6. GPUImageMovie
- Inheritance and properties
/** Source object for filtering movies
*/
@interface GPUImageMovie : GPUImageOutput
@property (readwrite, retain) AVAsset *asset;
@property (readwrite, retain) AVPlayerItem *playerItem;
@property(readwrite, retain) NSURL *url;
/** This enables the benchmarking mode, which logs out instantaneous and average frame times to the console
*/
@property(readwrite, nonatomic) BOOL runBenchmark;
/** This determines whether to play back a movie as fast as the frames can be processed, or if the original speed of the movie should be respected. Defaults to NO.
*/
@property(readwrite, nonatomic) BOOL playAtActualSpeed;
/** This determines whether the video should repeat (loop) at the end and restart from the beginning. Defaults to NO.
*/
@property(readwrite, nonatomic) BOOL shouldRepeat;
/** This specifies the progress of the process on a scale from 0 to 1.0. A value of 0 means the process has not yet begun, A value of 1.0 means the conversaion is complete.
This property is not key-value observable.
*/
@property(readonly, nonatomic) float progress;
/** This is used to send the delete Movie did complete playing alert
*/
@property (readwrite, nonatomic, assign) id <GPUImageMovieDelegate>delegate;
@property (readonly, nonatomic) AVAssetReader *assetReader;
@property (readonly, nonatomic) BOOL audioEncodingIsFinished;
@property (readonly, nonatomic) BOOL videoEncodingIsFinished;
- Purpose: GPUImageMovie inherits from GPUImageOutput and usually acts as the head of the chain; it can be initialized from a url, a playerItem, or an asset.
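A minimal playback sketch; the bundled movie name is hypothetical, and the movie and filter should be kept in strong properties in real code:
#import "GPUImage.h"
- (void)playFilteredMovie
{
    NSURL *movieURL = [[NSBundle mainBundle] URLForResource:@"sample" withExtension:@"m4v"];
    GPUImageMovie *movieFile = [[GPUImageMovie alloc] initWithURL:movieURL];
    movieFile.playAtActualSpeed = YES;   // respect the movie's own frame timing
    GPUImagePixellateFilter *filter = [[GPUImagePixellateFilter alloc] init];
    GPUImageView *filterView = (GPUImageView *)self.view; // assumes a GPUImageView-backed controller
    [movieFile addTarget:filter];
    [filter addTarget:filterView];
    [movieFile startProcessing];
}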
7. GPUImagePicture
- Inheritance and properties
@interface GPUImagePicture : GPUImageOutput
{
CGSize pixelSizeOfImage;
BOOL hasProcessedImage;
dispatch_semaphore_t imageUpdateSemaphore;
}
- Purpose:
GPUImagePicture is GPUImage's still-image processing class. It inherits from GPUImageOutput and usually acts as the head of the chain.
Key properties
- pixelSizeOfImage: the pixel size of the image.
- hasProcessedImage: whether the image has been processed.
- imageUpdateSemaphore: the GCD semaphore used for image processing.
Key methods
- (id)initWithCGImage:smoothlyScaleOutput:
Initializes the GPUImagePicture with the source image newImageSource and a flag for whether to use mipmaps. If the image exceeds the maximum OpenGL ES texture size, if mipmaps are requested, or if the image data is floating point or in the wrong color space, the image is first redrawn with Core Graphics. The pixel data is then sent to the GPU with glTexImage2D, and the CPU-side image data is released.
- (BOOL)processImageWithCompletionHandler:;
Notifies the targets to process the image and calls the completion block when done. At the start it marks hasProcessedImage as YES and calls dispatch_semaphore_wait() to make sure the previous pass has finished; otherwise this pass is skipped.
- (void)addTarget:atTextureLocation:;
Adds a target to the chain. If hasProcessedImage is YES, the image has already been processed, so it sets the target's input size directly and calls newFrameReadyAtTime() to notify the target.
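A sketch of processImageWithCompletionHandler: feeding a single filter; the asset name is hypothetical:
#import "GPUImage.h"
- (void)filterStillImage
{
    UIImage *inputImage = [UIImage imageNamed:@"sample.jpg"];
    // YES for smoothlyScaleOutput requests mipmaps, which is one of the cases that
    // triggers the Core Graphics redraw described above.
    GPUImagePicture *picture = [[GPUImagePicture alloc] initWithImage:inputImage
                                                  smoothlyScaleOutput:YES];
    GPUImageGaussianBlurFilter *blurFilter = [[GPUImageGaussianBlurFilter alloc] init];
    [picture addTarget:blurFilter];
    [blurFilter useNextFrameForImageCapture];
    [picture processImageWithCompletionHandler:^{
        UIImage *result = [blurFilter imageFromCurrentFramebuffer];
        NSLog(@"filtered image size: %@", NSStringFromCGSize(result.size));
    }];
}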
8. GPUImageRawDataInput
- Inheritance and properties
@interface GPUImageRawDataInput : GPUImageOutput
{
CGSize uploadedImageSize;
dispatch_semaphore_t dataUpdateSemaphore;
}
- Purpose:
GPUImageRawDataInput inherits from GPUImageOutput. It accepts binary data, interprets it in a given color format, turns it into an image and feeds it into the chain. The data passed in is neither copied nor retained, and you do not have to release it through the class after use. The bytes are uploaded to a GPU texture unit; the default format is BGRA with unsigned-byte components.
- Upload logic: first fetch an outputFramebuffer, then bind its texture, and finally upload the image data to the GPU with glTexImage2D.
- processData method: processes the data; if the previous pass has not finished yet, it returns immediately.
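A sketch that feeds a small, solid-color BGRA buffer into a chain; the buffer size and contents are arbitrary:
#import "GPUImage.h"
- (void)pushRawPixels
{
    // A 64x64 BGRA buffer filled with opaque red (B=0, G=0, R=255, A=255).
    const int width = 64, height = 64;
    GLubyte *pixels = (GLubyte *)malloc(width * height * 4);
    for (int i = 0; i < width * height; i++) {
        pixels[i * 4 + 0] = 0;    // B
        pixels[i * 4 + 1] = 0;    // G
        pixels[i * 4 + 2] = 255;  // R
        pixels[i * 4 + 3] = 255;  // A
    }
    GPUImageRawDataInput *rawInput =
        [[GPUImageRawDataInput alloc] initWithBytes:pixels size:CGSizeMake(width, height)];
    GPUImageGaussianBlurFilter *blurFilter = [[GPUImageGaussianBlurFilter alloc] init];
    [rawInput addTarget:blurFilter];
    // Notify the targets; skipped internally if the previous pass has not finished yet.
    [rawInput processData];
    // The bytes are not copied or retained, so free(pixels) only once no further
    // uploads will be made from this buffer.
}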
9. GPUImageRawDataOutput
- Inheritance and properties
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
@interface GPUImageRawDataOutput : NSObject <GPUImageInput> {
CGSize imageSize;
GPUImageRotationMode inputRotation;
BOOL outputBGRA;
}
#else
@interface GPUImageRawDataOutput : NSObject <GPUImageInput> {
CGSize imageSize;
GPUImageRotationMode inputRotation;
BOOL outputBGRA;
}
#endif
@property(readonly) GLubyte *rawBytesForImage;
@property(nonatomic, copy) void(^newFrameAvailableBlock)(void);
@property(nonatomic) BOOL enabled;
- Purpose:
GPUImageRawDataOutput implements the GPUImageInput protocol. It receives image data from the chain and returns it in raw binary form.
- rawBytesForImage property: a pointer to the raw byte data.
- GPUByteColorVector struct: an RGBA color struct that makes the raw bytes easier to read.
- With supportsFastTextureUpload the color format used is BGRA. If RGBA output is needed, an extra BGRA-to-RGBA component swizzle can be applied; in a shader the swizzle looks like this:
texture2D(inputImageTexture, textureCoordinate).bgra;
- lockNextFramebuffer property: indicates whether the next frame's image data should be read; if YES, CVPixelBufferLockBaseAddress is called to lock the corresponding CVPixelBufferRef.
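A sketch of reading raw bytes out of a chain; upstreamFilter stands in for whatever filter you already have, and the size is assumed to match the incoming frames:
#import "GPUImage.h"
- (GPUImageRawDataOutput *)attachRawOutputTo:(GPUImageOutput *)upstreamFilter
{
    CGSize outputSize = CGSizeMake(640.0, 480.0); // assumed to match the upstream frame size
    GPUImageRawDataOutput *rawOutput =
        [[GPUImageRawDataOutput alloc] initWithImageSize:outputSize resultsInBGRAFormat:YES];
    [upstreamFilter addTarget:rawOutput];
    __unsafe_unretained GPUImageRawDataOutput *weakOutput = rawOutput;
    [rawOutput setNewFrameAvailableBlock:^{
        GLubyte *bytes = [weakOutput rawBytesForImage];
        NSUInteger bytesPerRow = [weakOutput bytesPerRowInOutput];
        // First pixel, in BGRA order when resultsInBGRAFormat is YES.
        GLubyte blue = bytes[0], green = bytes[1], red = bytes[2], alpha = bytes[3];
        NSLog(@"r=%d g=%d b=%d a=%d (bytesPerRow=%lu)",
              red, green, blue, alpha, (unsigned long)bytesPerRow);
    }];
    return rawOutput;
}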
10. GPUImageMovieWriter
- Inheritance and properties
@interface GPUImageMovieWriter : NSObject <GPUImageInput>
{
BOOL alreadyFinishedRecording;
NSURL *movieURL;
NSString *fileType;
AVAssetWriter *assetWriter;
AVAssetWriterInput *assetWriterAudioInput;
AVAssetWriterInput *assetWriterVideoInput;
AVAssetWriterInputPixelBufferAdaptor *assetWriterPixelBufferInput;
GPUImageContext *_movieWriterContext;
CVPixelBufferRef renderTarget;
CVOpenGLESTextureRef renderTexture;
CGSize videoSize;
GPUImageRotationMode inputRotation;
}
@property(readwrite, nonatomic) BOOL hasAudioTrack;
@property(readwrite, nonatomic) BOOL shouldPassthroughAudio;
@property(readwrite, nonatomic) BOOL shouldInvalidateAudioSampleWhenDone;
@property(nonatomic, copy) void(^completionBlock)(void);
@property(nonatomic, copy) void(^failureBlock)(NSError*);
@property(nonatomic, assign) id<GPUImageMovieWriterDelegate> delegate;
@property(readwrite, nonatomic) BOOL encodingLiveVideo;
@property(nonatomic, copy) BOOL(^videoInputReadyCallback)(void);
@property(nonatomic, copy) BOOL(^audioInputReadyCallback)(void);
@property(nonatomic, copy) void(^audioProcessingCallback)(SInt16 **samplesRef, CMItemCount numSamplesInBuffer);
@property(nonatomic) BOOL enabled;
@property(nonatomic, readonly) AVAssetWriter *assetWriter;
@property(nonatomic, readonly) CMTime duration;
@property(nonatomic, assign) CGAffineTransform transform;
@property(nonatomic, copy) NSArray *metaData;
@property(nonatomic, assign, getter = isPaused) BOOL paused;
@property(nonatomic, retain) GPUImageContext *movieWriterContext;
- Purpose: GPUImageMovieWriter implements the GPUImageInput protocol and usually acts as the endpoint of the chain.
shouldPassthroughAudio indicates whether the source audio should be passed through unchanged.
movieFile.audioEncodingTarget = movieWriter;
means that the audio comes from the movie file.
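A sketch of re-encoding a movie through a filter with the audio passed through, following the pattern above; the file names are hypothetical, and these objects should live in strong properties in a real app:
#import "GPUImage.h"
- (void)transcodeMovie
{
    NSURL *sourceURL = [[NSBundle mainBundle] URLForResource:@"source" withExtension:@"m4v"];
    NSString *outputPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"filtered.m4v"];
    NSURL *outputURL = [NSURL fileURLWithPath:outputPath];
    GPUImageMovie *movieFile = [[GPUImageMovie alloc] initWithURL:sourceURL];
    GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];
    GPUImageMovieWriter *movieWriter =
        [[GPUImageMovieWriter alloc] initWithMovieURL:outputURL size:CGSizeMake(640.0, 480.0)];
    [movieFile addTarget:sepiaFilter];
    [sepiaFilter addTarget:movieWriter];
    // Audio comes from the movie file and is passed through untouched.
    movieWriter.shouldPassthroughAudio = YES;
    movieFile.audioEncodingTarget = movieWriter;
    [movieFile enableSynchronizedEncodingUsingMovieWriter:movieWriter];
    [movieWriter startRecording];
    [movieFile startProcessing];
    [movieWriter setCompletionBlock:^{
        [sepiaFilter removeTarget:movieWriter];
        [movieWriter finishRecording];
    }];
}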
11. GPUImageFilterPipeline
- Inheritance and properties
@interface GPUImageFilterPipeline : NSObject
{
NSString *stringValue;
}
@property (strong) NSMutableArray *filters;
@property (strong) GPUImageOutput *input;
@property (strong) id <GPUImageInput> output;
- Purpose:
GPUImageFilterPipeline is a filter pipeline: it chains the given filters together and attaches output as the final target.
- filters holds the filters, input is the source and output is the output target; the filters are linked together one after another, like a linked list.
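A sketch using the ordered-filters initializer; the camera and view are assumed to have been created as in the earlier camera example:
#import "GPUImage.h"
- (GPUImageFilterPipeline *)makePipelineFrom:(GPUImageVideoCamera *)videoCamera
                                          to:(GPUImageView *)filterView
{
    GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];
    GPUImageGaussianBlurFilter *blurFilter = [[GPUImageGaussianBlurFilter alloc] init];
    // The pipeline wires the filters, in order, between the input and the output target.
    GPUImageFilterPipeline *pipeline =
        [[GPUImageFilterPipeline alloc] initWithOrderedFilters:@[sepiaFilter, blurFilter]
                                                         input:videoCamera
                                                        output:filterView];
    return pipeline;
}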
12. GPUImageTextureOutput
- Inheritance and properties
@interface GPUImageTextureOutput : NSObject <GPUImageInput>
{
GPUImageFramebuffer *firstInputFramebuffer;
}
@property(readwrite, unsafe_unretained, nonatomic) id<GPUImageTextureOutputDelegate> delegate;
@property(readonly) GLuint texture;
@property(nonatomic) BOOL enabled;
Purpose:
GPUImageTextureOutput implements the GPUImageInput protocol. It receives images from the chain and returns the corresponding OpenGL ES texture.
- delegate property: the callback object implementing the GPUImageTextureOutputDelegate protocol.
- texture property: the OpenGL ES texture, read-only.
- enabled property: whether the output is enabled; enabled by default.
- doneWithTexture method: signals that the texture has been consumed and unlocks firstInputFramebuffer.
13. GPUImageFilterGroup
- Inheritance and properties
@interface GPUImageFilterGroup : GPUImageOutput <GPUImageInput>
{
NSMutableArray *filters;
BOOL isEndProcessing;
}
@property(readwrite, nonatomic, strong) GPUImageOutput<GPUImageInput> *terminalFilter;
@property(readwrite, nonatomic, strong) NSArray *initialFilters;
@property(readwrite, nonatomic, strong) GPUImageOutput<GPUImageInput> *inputFilterToIgnoreForUpdates;
- Purpose: a composite filter that combines several filters; adding the filters in a different order produces a different effect.
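A sketch of wiring two filters into a group; the internal connections and the initial/terminal filters have to be set up by hand, and swapping the order changes the result:
#import "GPUImage.h"
- (GPUImageFilterGroup *)makeSepiaThenPixellateGroup
{
    GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];
    GPUImagePixellateFilter *pixellateFilter = [[GPUImagePixellateFilter alloc] init];
    GPUImageFilterGroup *filterGroup = [[GPUImageFilterGroup alloc] init];
    [filterGroup addFilter:sepiaFilter];
    [filterGroup addFilter:pixellateFilter];
    // Internal wiring: sepia feeds pixellate.
    [sepiaFilter addTarget:pixellateFilter];
    // Frames entering the group go to initialFilters; the group's output comes from terminalFilter.
    filterGroup.initialFilters = @[sepiaFilter];
    filterGroup.terminalFilter = pixellateFilter;
    // The group can now be used like any single filter:
    // [videoCamera addTarget:filterGroup]; [filterGroup addTarget:filterView];
    return filterGroup;
}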
14. GPUImageTextureInput
- Inheritance and properties
@interface GPUImageTextureInput : GPUImageOutput
{
CGSize textureSize;
}
// Initialization and teardown
- (id)initWithTexture:(GLuint)newInputTexture size:(CGSize)newTextureSize;
// Image rendering
- (void)processTextureWithFrameTime:(CMTime)frameTime;
@end
- Purpose:
GPUImageTextureInput inherits from GPUImageOutput and can act as the head of the chain, feeding the contents of an existing OpenGL ES texture into the chain for processing. The textureSize property is the size of the texture.
On initialization it allocates a GPUImageFramebuffer that wraps the texture's information; when processing, it simply calls the targets' frame-ready methods, because the image data already lives in memory managed by OpenGL ES.
GPUImageTextureOutput and GPUImageTextureInput move textures into and out of OpenGL ES: the output of GPUImage can be used as an OpenGL ES texture, or the output of OpenGL ES can be fed into GPUImage as a texture input.
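A sketch of pushing an externally created OpenGL ES texture through a filter and getting a texture back out; the texture id and size are assumed to come from your own GL code:
#import "GPUImage.h"
- (GPUImageTextureOutput *)filterExternalTexture:(GLuint)existingTexture
                                            size:(CGSize)textureSize
{
    GPUImageTextureInput *textureInput =
        [[GPUImageTextureInput alloc] initWithTexture:existingTexture size:textureSize];
    GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];
    GPUImageTextureOutput *textureOutput = [[GPUImageTextureOutput alloc] init];
    [textureInput addTarget:sepiaFilter];
    [sepiaFilter addTarget:textureOutput];
    // Push the existing texture through the chain for one frame.
    [textureInput processTextureWithFrameTime:kCMTimeZero];
    // textureOutput.texture now names the filtered OpenGL ES texture; call
    // [textureOutput doneWithTexture] once your own GL code has finished reading it.
    return textureOutput;
}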
15. GPUImageUIElement
- Inheritance and properties
@interface GPUImageUIElement : GPUImageOutput
// Initialization and teardown
- (id)initWithView:(UIView *)inputView;
- (id)initWithLayer:(CALayer *)inputLayer;
// Layer management
- (CGSize)layerSizeInPixels;
- (void)update;
- (void)updateUsingCurrentTime;
- (void)updateWithTimestamp:(CMTime)frameTime;
@end
- Purpose:
GPUImageUIElement inherits from GPUImageOutput and acts as the head of the chain. It renders a UIView into an image with Core Graphics, uploads it with glTexImage2D to the texture behind outputFramebuffer, and finally notifies the targets that the texture is ready.
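A sketch that blends a UILabel over the filtered camera feed; videoCamera and filterView are assumed to exist as in the earlier camera example, and the label is re-rendered for every frame from frameProcessingCompletionBlock:
#import "GPUImage.h"
- (void)overlayLabelOnCamera:(GPUImageVideoCamera *)videoCamera
                        view:(GPUImageView *)filterView
{
    UILabel *label = [[UILabel alloc] initWithFrame:CGRectMake(0.0, 0.0, 240.0, 40.0)];
    label.text = @"Hello GPUImage";
    label.textColor = [UIColor whiteColor];
    label.backgroundColor = [UIColor clearColor];
    GPUImageUIElement *uiElement = [[GPUImageUIElement alloc] initWithView:label];
    GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];
    GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
    blendFilter.mix = 1.0;
    [videoCamera addTarget:sepiaFilter];
    [sepiaFilter addTarget:blendFilter];   // first input: the filtered camera frame
    [uiElement addTarget:blendFilter];     // second input: the rendered UIView
    [blendFilter addTarget:filterView];
    // Re-render the UIView once per camera frame. Keep uiElement and the filters
    // in strong properties in a real app so they are not released by ARC.
    __unsafe_unretained GPUImageUIElement *weakElement = uiElement;
    [sepiaFilter setFrameProcessingCompletionBlock:^(GPUImageOutput *output, CMTime time) {
        [weakElement update];
    }];
}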
16. GPUImageBuffer
- Inheritance and properties
@interface GPUImageBuffer : GPUImageFilter
{
NSMutableArray *bufferedFramebuffers;
}
@property(readwrite, nonatomic) NSUInteger bufferSize;
@end
17. GPUImageFramebuffer
- Inheritance and properties
@interface GPUImageFramebuffer : NSObject
@property(readonly) CGSize size;
@property(readonly) GPUTextureOptions textureOptions;
@property(readonly) GLuint texture;
@property(readonly) BOOL missingFramebuffer;
- Purpose: suppose we wrote our own OpenGL ES program to process an image; it would involve the following steps:
  - Initialize the OpenGL ES environment; compile and link the vertex and fragment shaders.
  - Buffer the vertex and texture-coordinate data and send the image data to the GPU.
  - Draw the primitives into a specific framebuffer.
  - Read the rendered image back out of the framebuffer.
  GPUImageFilter is responsible for steps one to three; GPUImageFramebuffer is responsible for step four.
18. GPUImageFramebufferCache
- Inheritance and properties
@interface GPUImageFramebufferCache : NSObject
// Framebuffer management
- (GPUImageFramebuffer *)fetchFramebufferForSize:(CGSize)framebufferSize textureOptions:(GPUTextureOptions)textureOptions onlyTexture:(BOOL)onlyTexture;
- (GPUImageFramebuffer *)fetchFramebufferForSize:(CGSize)framebufferSize onlyTexture:(BOOL)onlyTexture;
- (void)returnFramebufferToCache:(GPUImageFramebuffer *)framebuffer;
- (void)purgeAllUnassignedFramebuffers;
- (void)addFramebufferToActiveImageCaptureList:(GPUImageFramebuffer *)framebuffer;
- (void)removeFramebufferFromActiveImageCaptureList:(GPUImageFramebuffer *)framebuffer;
@end
- Purpose:
GPUImageFramebufferCache is the manager class for GPUImageFramebuffer objects.
Key properties
- framebufferCache: the cache dictionary
- framebufferTypeCounts: the dictionary of cache counts per type
- activeImageCaptureList: the list of GPUImageFramebuffers whose image data is currently being read
- framebufferCacheQueue: the cache dispatch queue
Key methods
- (NSString *)hashForSize:textureOptions:onlyTexture:;
Builds a hash string from size, textureOptions and onlyTexture. The hash string plus the current cache count forms the key into framebufferCache; if no cached framebuffer is found for that count, a new one is created.
- (void)returnFramebufferToCache:;
Returns a framebuffer to the cache. It builds the hash string from size, textureOptions and onlyTexture, and the hash string plus the current cache count forms the framebufferCache key. (The count is appended because the hash string alone is not unique.)
- (void)addFramebufferToActiveImageCaptureList:; and - (void)removeFramebufferFromActiveImageCaptureList:;
These two methods keep a reference to a GPUImageFramebuffer while newCGImageFromFramebufferContents() reads its image data, and release it in dataProviderUnlockCallback() once the data has been read.
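A sketch of fetching a framebuffer from the shared cache and returning it; this mirrors what filters do internally rather than typical application code, and the size is arbitrary:
#import "GPUImage.h"
static void DemoFramebufferCache(void)
{
    runSynchronouslyOnVideoProcessingQueue(^{
        [GPUImageContext useImageProcessingContext];
        GPUImageFramebufferCache *cache =
            [[GPUImageContext sharedImageProcessingContext] framebufferCache];
        // Fetch (or create) a framebuffer; onlyTexture:NO asks for a full FBO, not just a texture.
        GPUImageFramebuffer *framebuffer =
            [cache fetchFramebufferForSize:CGSizeMake(256.0, 256.0) onlyTexture:NO];
        [framebuffer activateFramebuffer]; // bind it so subsequent GL draws render into it
        // ... issue draw calls here ...
        // The fetch leaves the framebuffer locked; unlocking lets the cache reuse it.
        [framebuffer unlock];
    });
}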
后記
未完瞭吃,待續(xù)~~~