GPUImage Source Code Reading (4)

Overview

GPUImage is a well-known open-source image processing library that lets you apply GPU-accelerated filters and other effects to images, video, and the camera. Compared with the Core Image framework, GPUImage's interfaces make it easy to build your own custom filters. Project page: https://github.com/BradLarson/GPUImage
This article reads through the source of three classes in the GPUImage framework: GPUImagePicture, GPUImageView, and GPUImageUIElement. They cover image loading, image display, and UI rendering on iOS; whenever you use GPUImage to filter and display images, you will almost certainly use them. The classes covered are:
  • GPUImagePicture
  • GPUImageView
  • GPUImageUIElement

Results

GPUImagePicture.png
GPUImageUIElement.png

GPUImagePicture

As the name suggests, GPUImagePicture is the class in the GPUImage framework that deals with still images; its main job is to turn a UIImage or CGImage into a texture object. GPUImagePicture inherits from GPUImageOutput, so it can act as an output; since it does not implement the GPUImageInput protocol, it cannot accept input. It therefore usually serves as the source of a filter chain.

  • Straight alpha vs. premultiplied alpha

When an RGBA color is described with straight alpha, the alpha value is stored only in the alpha channel. For example, to describe red at 60% opacity, you would use (255, 0, 0, 255 * 0.6) = (255, 0, 0, 153), where 153 (= 255 * 0.6) indicates that the color should be 60% opaque.

When an RGBA color is described with premultiplied alpha, each color component is also multiplied by the alpha value: (255 * 0.6, 0 * 0.6, 0 * 0.6, 255 * 0.6) = (153, 0, 0, 153).
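
To make the relationship concrete, here is a minimal C sketch (hypothetical helper names, 8-bit channels) converting one pixel between the two representations. Note that the round trip loses precision at low alpha values, which is part of why removing premultiplication is offered as an explicit option in GPUImagePicture rather than being free:

#include <math.h>
#include <stdint.h>

typedef struct { uint8_t r, g, b, a; } RGBA8;

// Straight alpha -> premultiplied: scale each color channel by a/255.
static RGBA8 premultiply(RGBA8 p) {
    float a = p.a / 255.0f;
    return (RGBA8){ (uint8_t)roundf(p.r * a), (uint8_t)roundf(p.g * a),
                    (uint8_t)roundf(p.b * a), p.a };
}

// Premultiplied -> straight alpha: divide the scaling back out.
static RGBA8 unpremultiply(RGBA8 p) {
    if (p.a == 0) return p; // fully transparent: the color is unrecoverable
    float a = p.a / 255.0f;
    return (RGBA8){ (uint8_t)fminf(255.0f, roundf(p.r / a)),
                    (uint8_t)fminf(255.0f, roundf(p.g / a)),
                    (uint8_t)fminf(255.0f, roundf(p.b / a)), p.a };
}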

  • Initializers. There are quite a few of them because many initialization options are offered; the benefit is flexibility, making it easy to customize for your own needs.
// Initialize from an image URL
- (id)initWithURL:(NSURL *)url;

// Initialize from a UIImage or a CGImage
- (id)initWithImage:(UIImage *)newImageSource;
- (id)initWithCGImage:(CGImageRef)newImageSource;

// Initialize from a UIImage or a CGImage, choosing whether to smoothly scale the output
- (id)initWithImage:(UIImage *)newImageSource smoothlyScaleOutput:(BOOL)smoothlyScaleOutput;
- (id)initWithCGImage:(CGImageRef)newImageSource smoothlyScaleOutput:(BOOL)smoothlyScaleOutput;

// Initialize from a UIImage or a CGImage, choosing whether to remove premultiplied alpha
- (id)initWithImage:(UIImage *)newImageSource removePremultiplication:(BOOL)removePremultiplication;
- (id)initWithCGImage:(CGImageRef)newImageSource removePremultiplication:(BOOL)removePremultiplication;

// Initialize from a UIImage or a CGImage with both options
- (id)initWithImage:(UIImage *)newImageSource smoothlyScaleOutput:(BOOL)smoothlyScaleOutput removePremultiplication:(BOOL)removePremultiplication;
- (id)initWithCGImage:(CGImageRef)newImageSource smoothlyScaleOutput:(BOOL)smoothlyScaleOutput removePremultiplication:(BOOL)removePremultiplication;

There are many initializers, but all of them funnel into the one below, so it is the only one we need to read. The implementation is fairly involved, but the basic flow is: 1. work out suitable dimensions for the image (they must not exceed the maximum texture size OpenGL ES allows); 2. if smoothlyScaleOutput is used, round the dimensions up to the nearest powers of two, which forces a redraw; 3. if no redraw is needed, read the endianness, alpha layout, and other bitmap info directly from the CGImage; 4. if a redraw is needed, redraw with Core Graphics; 5. depending on the removePremultiplication option, undo the premultiplied alpha; 6. generate a texture cache object from the resulting pixel data; 7. depending on shouldSmoothlyScaleOutput, generate mipmaps; 8. finally, release the resources.

- (id)initWithCGImage:(CGImageRef)newImageSource smoothlyScaleOutput:(BOOL)smoothlyScaleOutput removePremultiplication:(BOOL)removePremultiplication;
{
    if (!(self = [super init]))
    {
        return nil;
    }
    
    hasProcessedImage = NO;
    self.shouldSmoothlyScaleOutput = smoothlyScaleOutput;
    imageUpdateSemaphore = dispatch_semaphore_create(0);
    dispatch_semaphore_signal(imageUpdateSemaphore);

    // TODO: Dispatch this whole thing asynchronously to move image loading off main thread
    CGFloat widthOfImage = CGImageGetWidth(newImageSource);
    CGFloat heightOfImage = CGImageGetHeight(newImageSource);

    // If passed an empty image reference, CGContextDrawImage will fail in future versions of the SDK.
    NSAssert( widthOfImage > 0 && heightOfImage > 0, @"Passed image must not be empty - it should be at least 1px tall and wide");
    
    pixelSizeOfImage = CGSizeMake(widthOfImage, heightOfImage);
    CGSize pixelSizeToUseForTexture = pixelSizeOfImage;
    
    BOOL shouldRedrawUsingCoreGraphics = NO;
    
    // For now, deal with images larger than the maximum texture size by resizing to be within that limit
    CGSize scaledImageSizeToFitOnGPU = [GPUImageContext sizeThatFitsWithinATextureForSize:pixelSizeOfImage];
    if (!CGSizeEqualToSize(scaledImageSizeToFitOnGPU, pixelSizeOfImage))
    {
        pixelSizeOfImage = scaledImageSizeToFitOnGPU;
        pixelSizeToUseForTexture = pixelSizeOfImage;
        shouldRedrawUsingCoreGraphics = YES;
    }
    
    if (self.shouldSmoothlyScaleOutput)
    {
        // In order to use mipmaps, you need to provide power-of-two textures, so convert to the next largest power of two and stretch to fill
        CGFloat powerClosestToWidth = ceil(log2(pixelSizeOfImage.width));
        CGFloat powerClosestToHeight = ceil(log2(pixelSizeOfImage.height));
        
        pixelSizeToUseForTexture = CGSizeMake(pow(2.0, powerClosestToWidth), pow(2.0, powerClosestToHeight));
        
        shouldRedrawUsingCoreGraphics = YES;
    }
    
    GLubyte *imageData = NULL;
    CFDataRef dataFromImageDataProvider = NULL;
    GLenum format = GL_BGRA;
    BOOL isLitteEndian = YES;
    BOOL alphaFirst = NO;
    BOOL premultiplied = NO;
 
    if (!shouldRedrawUsingCoreGraphics) {
        /* Check that the memory layout is compatible with GL, as we cannot use glPixelStore to
         * tell GL about the memory layout with GLES.
         */
        if (CGImageGetBytesPerRow(newImageSource) != CGImageGetWidth(newImageSource) * 4 ||
            CGImageGetBitsPerPixel(newImageSource) != 32 ||
            CGImageGetBitsPerComponent(newImageSource) != 8)
        {
            shouldRedrawUsingCoreGraphics = YES;
        } else {
            /* Check that the bitmap pixel format is compatible with GL */
            CGBitmapInfo bitmapInfo = CGImageGetBitmapInfo(newImageSource);
            if ((bitmapInfo & kCGBitmapFloatComponents) != 0) {
                /* We don't support float components for use directly in GL */
                shouldRedrawUsingCoreGraphics = YES;
            } else {
                CGBitmapInfo byteOrderInfo = bitmapInfo & kCGBitmapByteOrderMask;
                if (byteOrderInfo == kCGBitmapByteOrder32Little) {
                    /* Little endian, for alpha-first we can use this bitmap directly in GL */
                    CGImageAlphaInfo alphaInfo = bitmapInfo & kCGBitmapAlphaInfoMask;
                    if (alphaInfo != kCGImageAlphaPremultipliedFirst && alphaInfo != kCGImageAlphaFirst &&
                        alphaInfo != kCGImageAlphaNoneSkipFirst) {
                        shouldRedrawUsingCoreGraphics = YES;
                    }
                } else if (byteOrderInfo == kCGBitmapByteOrderDefault || byteOrderInfo == kCGBitmapByteOrder32Big) {
                    isLitteEndian = NO;
                    /* Big endian, for alpha-last we can use this bitmap directly in GL */
                    CGImageAlphaInfo alphaInfo = bitmapInfo & kCGBitmapAlphaInfoMask;
                    if (alphaInfo != kCGImageAlphaPremultipliedLast && alphaInfo != kCGImageAlphaLast &&
                        alphaInfo != kCGImageAlphaNoneSkipLast) {
                        shouldRedrawUsingCoreGraphics = YES;
                    } else {
                        /* Can access directly using GL_RGBA pixel format */
                        premultiplied = alphaInfo == kCGImageAlphaPremultipliedLast || alphaInfo == kCGImageAlphaPremultipliedFirst;
                        alphaFirst = alphaInfo == kCGImageAlphaFirst || alphaInfo == kCGImageAlphaPremultipliedFirst;
                        format = GL_RGBA;
                    }
                }
            }
        }
    }
    
    //    CFAbsoluteTime elapsedTime, startTime = CFAbsoluteTimeGetCurrent();
    
    if (shouldRedrawUsingCoreGraphics)
    {
        // For resized or incompatible image: redraw
        imageData = (GLubyte *) calloc(1, (int)pixelSizeToUseForTexture.width * (int)pixelSizeToUseForTexture.height * 4);
        
        CGColorSpaceRef genericRGBColorspace = CGColorSpaceCreateDeviceRGB();
        
        CGContextRef imageContext = CGBitmapContextCreate(imageData, (size_t)pixelSizeToUseForTexture.width, (size_t)pixelSizeToUseForTexture.height, 8, (size_t)pixelSizeToUseForTexture.width * 4, genericRGBColorspace,  kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        //        CGContextSetBlendMode(imageContext, kCGBlendModeCopy); // From Technical Q&A QA1708: http://developer.apple.com/library/ios/#qa/qa1708/_index.html
        CGContextDrawImage(imageContext, CGRectMake(0.0, 0.0, pixelSizeToUseForTexture.width, pixelSizeToUseForTexture.height), newImageSource);
        CGContextRelease(imageContext);
        CGColorSpaceRelease(genericRGBColorspace);
        isLitteEndian = YES;
        alphaFirst = YES;
        premultiplied = YES;
    }
    else
    {
        // Access the raw image bytes directly
        dataFromImageDataProvider = CGDataProviderCopyData(CGImageGetDataProvider(newImageSource));
        imageData = (GLubyte *)CFDataGetBytePtr(dataFromImageDataProvider);
    }
 
    if (removePremultiplication && premultiplied) {
        NSUInteger totalNumberOfPixels = round(pixelSizeToUseForTexture.width * pixelSizeToUseForTexture.height);
        uint32_t *pixelP = (uint32_t *)imageData;
        uint32_t pixel;
        CGFloat  srcR, srcG, srcB, srcA;

        for (NSUInteger idx=0; idx<totalNumberOfPixels; idx++, pixelP++) {
            pixel = isLitteEndian ? CFSwapInt32LittleToHost(*pixelP) : CFSwapInt32BigToHost(*pixelP);

            if (alphaFirst) {
                srcA = (CGFloat)((pixel & 0xff000000) >> 24) / 255.0f;
            }
            else {
                srcA = (CGFloat)(pixel & 0x000000ff) / 255.0f;
                pixel >>= 8;
            }

            srcR = (CGFloat)((pixel & 0x00ff0000) >> 16) / 255.0f;
            srcG = (CGFloat)((pixel & 0x0000ff00) >> 8) / 255.0f;
            srcB = (CGFloat)(pixel & 0x000000ff) / 255.0f;

            // Divide the premultiplied color channels back out by alpha
            // (note: a pixel whose srcA is 0 would divide by zero here)
            srcR /= srcA; srcG /= srcA; srcB /= srcA;

            pixel = (uint32_t)(srcR * 255.0) << 16;
            pixel |= (uint32_t)(srcG * 255.0) << 8;
            pixel |= (uint32_t)(srcB * 255.0);

            if (alphaFirst) {
                pixel |= (uint32_t)(srcA * 255.0) << 24;
            }
            else {
                pixel <<= 8;
                pixel |= (uint32_t)(srcA * 255.0);
            }
            *pixelP = isLitteEndian ? CFSwapInt32HostToLittle(pixel) : CFSwapInt32HostToBig(pixel);
        }
    }
 
    //    elapsedTime = (CFAbsoluteTimeGetCurrent() - startTime) * 1000.0;
    //    NSLog(@"Core Graphics drawing time: %f", elapsedTime);
    
    //    CGFloat currentRedTotal = 0.0f, currentGreenTotal = 0.0f, currentBlueTotal = 0.0f, currentAlphaTotal = 0.0f;
    // NSUInteger totalNumberOfPixels = round(pixelSizeToUseForTexture.width * pixelSizeToUseForTexture.height);
    //
    //    for (NSUInteger currentPixel = 0; currentPixel < totalNumberOfPixels; currentPixel++)
    //    {
    //        currentBlueTotal += (CGFloat)imageData[(currentPixel * 4)] / 255.0f;
    //        currentGreenTotal += (CGFloat)imageData[(currentPixel * 4) + 1] / 255.0f;
    //        currentRedTotal += (CGFloat)imageData[(currentPixel * 4 + 2)] / 255.0f;
    //        currentAlphaTotal += (CGFloat)imageData[(currentPixel * 4) + 3] / 255.0f;
    //    }
    //
    //    NSLog(@"Debug, average input image red: %f, green: %f, blue: %f, alpha: %f", currentRedTotal / (CGFloat)totalNumberOfPixels, currentGreenTotal / (CGFloat)totalNumberOfPixels, currentBlueTotal / (CGFloat)totalNumberOfPixels, currentAlphaTotal / (CGFloat)totalNumberOfPixels);
    
    runSynchronouslyOnVideoProcessingQueue(^{
        [GPUImageContext useImageProcessingContext];
        
        outputFramebuffer = [[GPUImageContext sharedFramebufferCache] fetchFramebufferForSize:pixelSizeToUseForTexture onlyTexture:YES];
        [outputFramebuffer disableReferenceCounting];

        glBindTexture(GL_TEXTURE_2D, [outputFramebuffer texture]);
        if (self.shouldSmoothlyScaleOutput)
        {
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
        }
        // no need to use self.outputTextureOptions here since pictures need this texture formats and type
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (int)pixelSizeToUseForTexture.width, (int)pixelSizeToUseForTexture.height, 0, format, GL_UNSIGNED_BYTE, imageData);
        
        if (self.shouldSmoothlyScaleOutput)
        {
            glGenerateMipmap(GL_TEXTURE_2D);
        }
        glBindTexture(GL_TEXTURE_2D, 0);
    });
    
    if (shouldRedrawUsingCoreGraphics)
    {
        free(imageData);
    }
    else
    {
        if (dataFromImageDataProvider)
        {
            CFRelease(dataFromImageDataProvider);
        }
    }
    
    return self;
}
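
Step 1's clamp to the GPU limit happens in +[GPUImageContext sizeThatFitsWithinATextureForSize:]. A lightly trimmed sketch of that method, as found in GPUImageContext.m (maximumTextureSizeForThisDevice caches the GL_MAX_TEXTURE_SIZE query), shows the aspect-preserving scale-down:

// Shrink a size so that neither dimension exceeds the device's maximum
// texture size, preserving the aspect ratio.
+ (CGSize)sizeThatFitsWithinATextureForSize:(CGSize)inputSize;
{
    GLint maxTextureSize = [self maximumTextureSizeForThisDevice];
    if ((inputSize.width < maxTextureSize) && (inputSize.height < maxTextureSize))
    {
        return inputSize;
    }

    CGSize adjustedSize;
    if (inputSize.width > inputSize.height)
    {
        adjustedSize.width = (CGFloat)maxTextureSize;
        adjustedSize.height = ((CGFloat)maxTextureSize / inputSize.width) * inputSize.height;
    }
    else
    {
        adjustedSize.height = (CGFloat)maxTextureSize;
        adjustedSize.width = ((CGFloat)maxTextureSize / inputSize.height) * inputSize.width;
    }

    return adjustedSize;
}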
  • Other methods. These are mainly concerned with processing the image.
// Image rendering
- (void)processImage;
- (CGSize)outputImageSize;

- (BOOL)processImageWithCompletionHandler:(void (^)(void))completion;
- (void)processImageUpToFilter:(GPUImageOutput<GPUImageInput> *)finalFilterInChain withCompletionHandler:(void (^)(UIImage *processedImage))block;
// Process the image
- (void)processImage;
{
    [self processImageWithCompletionHandler:nil];
}

// The output image size. Because the image dimensions may have been adjusted (see the initializer above), this method exposes the size actually in use.
- (CGSize)outputImageSize;
{
    return pixelSizeOfImage;
}

// Process the image, with an optional completion block invoked when processing finishes
- (BOOL)processImageWithCompletionHandler:(void (^)(void))completion;
{
    hasProcessedImage = YES;
    
    //    dispatch_semaphore_wait(imageUpdateSemaphore, DISPATCH_TIME_FOREVER);
    
    // If the semaphore count is below 1, return immediately; if it is at least 1, decrement it and continue
    if (dispatch_semaphore_wait(imageUpdateSemaphore, DISPATCH_TIME_NOW) != 0)
    {
        return NO;
    }
    
    // Hand the framebuffer to every target for processing
    runAsynchronouslyOnVideoProcessingQueue(^{        
        for (id<GPUImageInput> currentTarget in targets)
        {
            NSInteger indexOfObject = [targets indexOfObject:currentTarget];
            NSInteger textureIndexOfTarget = [[targetTextureIndices objectAtIndex:indexOfObject] integerValue];
            
            [currentTarget setCurrentlyReceivingMonochromeInput:NO];
            [currentTarget setInputSize:pixelSizeOfImage atIndex:textureIndexOfTarget];
            [currentTarget setInputFramebuffer:outputFramebuffer atIndex:textureIndexOfTarget];
            [currentTarget newFrameReadyAtTime:kCMTimeIndefinite atIndex:textureIndexOfTarget];
        }
        
        // Processing finished: increment the semaphore again
        dispatch_semaphore_signal(imageUpdateSemaphore);
        
        // If a completion block was supplied, invoke it
        if (completion != nil) {
            completion();
        }
    });
    
    return YES;
}

// Have the final filter in the chain produce a UIImage
- (void)processImageUpToFilter:(GPUImageOutput<GPUImageInput> *)finalFilterInChain withCompletionHandler:(void (^)(UIImage *processedImage))block;
{
    [finalFilterInChain useNextFrameForImageCapture];
    [self processImageWithCompletionHandler:^{
        UIImage *imageFromFilter = [finalFilterInChain imageFromCurrentFramebuffer];
        block(imageFromFilter);
    }];
}
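
As a usage sketch (the picture and sepiaFilter names are illustrative, not from this file), capturing a filtered still with processImageUpToFilter:withCompletionHandler: might look like:

GPUImagePicture *picture = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"1.jpg"]];
GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];
[picture addTarget:sepiaFilter];

[picture processImageUpToFilter:sepiaFilter withCompletionHandler:^(UIImage *processedImage) {
    // processedImage is the fully filtered result; hand it back to UIKit on the main queue.
    dispatch_async(dispatch_get_main_queue(), ^{
        // e.g. someImageView.image = processedImage;
    });
}];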

GPUImageView

As the name suggests, GPUImageView is the class in the GPUImage framework responsible for displaying images. GPUImageView implements the GPUImageInput protocol, so it can accept GPUImageFramebuffer input. It therefore usually acts as the terminal node of a filter chain, displaying the processed framebuffer. GPUImageView leans on a fair amount of OpenGL ES knowledge that this article will not cover in depth; if you are unfamiliar with it, see my OpenGL ES tutorial series.

  • Initialization
- (id)initWithFrame:(CGRect)frame;
-(id)initWithCoder:(NSCoder *)coder;

Initialization mainly covers the following: 1. setting the relevant OpenGL ES properties; 2. creating the shader program; 3. looking up the locations of the attribute and uniform variables; 4. setting the clear color; 5. creating the default framebuffer and renderbuffer used for display; 6. adjusting the vertex coordinates according to the fill mode (see the sketch after textureCoordinatesForRotation: below).

- (id)initWithFrame:(CGRect)frame
{
    if (!(self = [super initWithFrame:frame]))
    {
        return nil;
    }
    
    [self commonInit];
    
    return self;
}

-(id)initWithCoder:(NSCoder *)coder
{
    if (!(self = [super initWithCoder:coder])) 
    {
        return nil;
    }

    [self commonInit];

    return self;
}

- (void)commonInit;
{
    // Set scaling to account for Retina display    
    if ([self respondsToSelector:@selector(setContentScaleFactor:)])
    {
        self.contentScaleFactor = [[UIScreen mainScreen] scale];
    }

    inputRotation = kGPUImageNoRotation;
    self.opaque = YES;
    self.hidden = NO;
    CAEAGLLayer *eaglLayer = (CAEAGLLayer *)self.layer;
    eaglLayer.opaque = YES;
    eaglLayer.drawableProperties = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithBool:NO], kEAGLDrawablePropertyRetainedBacking, kEAGLColorFormatRGBA8, kEAGLDrawablePropertyColorFormat, nil];

    self.enabled = YES;
    
    runSynchronouslyOnVideoProcessingQueue(^{
        [GPUImageContext useImageProcessingContext];
        
        displayProgram = [[GPUImageContext sharedImageProcessingContext] programForVertexShaderString:kGPUImageVertexShaderString fragmentShaderString:kGPUImagePassthroughFragmentShaderString];
        if (!displayProgram.initialized)
        {
            [displayProgram addAttribute:@"position"];
            [displayProgram addAttribute:@"inputTextureCoordinate"];
            
            if (![displayProgram link])
            {
                NSString *progLog = [displayProgram programLog];
                NSLog(@"Program link log: %@", progLog);
                NSString *fragLog = [displayProgram fragmentShaderLog];
                NSLog(@"Fragment shader compile log: %@", fragLog);
                NSString *vertLog = [displayProgram vertexShaderLog];
                NSLog(@"Vertex shader compile log: %@", vertLog);
                displayProgram = nil;
                NSAssert(NO, @"Filter shader link failed");
            }
        }
        
        displayPositionAttribute = [displayProgram attributeIndex:@"position"];
        displayTextureCoordinateAttribute = [displayProgram attributeIndex:@"inputTextureCoordinate"];
        displayInputTextureUniform = [displayProgram uniformIndex:@"inputImageTexture"]; // This does assume a name of "inputImageTexture" for the fragment shader

        [GPUImageContext setActiveShaderProgram:displayProgram];
        glEnableVertexAttribArray(displayPositionAttribute);
        glEnableVertexAttribArray(displayTextureCoordinateAttribute);
        
        [self setBackgroundColorRed:0.0 green:0.0 blue:0.0 alpha:1.0];
        _fillMode = kGPUImageFillModePreserveAspectRatio;
        [self createDisplayFramebuffer];
    });
}
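
The displayProgram built above pairs GPUImage's standard vertex shader with a pass-through fragment shader. For reference, the two shader strings are essentially the following GLSL, matching the attribute and uniform names looked up above:

// kGPUImageVertexShaderString
attribute vec4 position;
attribute vec4 inputTextureCoordinate;

varying vec2 textureCoordinate;

void main()
{
    gl_Position = position;
    textureCoordinate = inputTextureCoordinate.xy;
}

// kGPUImagePassthroughFragmentShaderString
varying highp vec2 textureCoordinate;

uniform sampler2D inputImageTexture;

void main()
{
    gl_FragColor = texture2D(inputImageTexture, textureCoordinate);
}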
  • Method list
// Set the background color
- (void)setBackgroundColorRed:(GLfloat)redComponent green:(GLfloat)greenComponent blue:(GLfloat)blueComponent alpha:(GLfloat)alphaComponent;

// This method has an intentionally empty implementation
- (void)setCurrentlyReceivingMonochromeInput:(BOOL)newValue;

The implementations here are fairly simple; the methods below are the main ones worth reading.

- (void)setBackgroundColorRed:(GLfloat)redComponent green:(GLfloat)greenComponent blue:(GLfloat)blueComponent alpha:(GLfloat)alphaComponent;
{
    backgroundColorRed = redComponent;
    backgroundColorGreen = greenComponent;
    backgroundColorBlue = blueComponent;
    backgroundColorAlpha = alphaComponent;
}

- (void)setCurrentlyReceivingMonochromeInput:(BOOL)newValue;
{
    
}

// Return the texture coordinates for a given rotation mode
+ (const GLfloat *)textureCoordinatesForRotation:(GPUImageRotationMode)rotationMode;
{
//    static const GLfloat noRotationTextureCoordinates[] = {
//        0.0f, 0.0f,
//        1.0f, 0.0f,
//        0.0f, 1.0f,
//        1.0f, 1.0f,
//    };
    
    static const GLfloat noRotationTextureCoordinates[] = {
        0.0f, 1.0f,
        1.0f, 1.0f,
        0.0f, 0.0f,
        1.0f, 0.0f,
    };

    static const GLfloat rotateRightTextureCoordinates[] = {
        1.0f, 1.0f,
        1.0f, 0.0f,
        0.0f, 1.0f,
        0.0f, 0.0f,
    };

    static const GLfloat rotateLeftTextureCoordinates[] = {
        0.0f, 0.0f,
        0.0f, 1.0f,
        1.0f, 0.0f,
        1.0f, 1.0f,
    };
        
    static const GLfloat verticalFlipTextureCoordinates[] = {
        0.0f, 0.0f,
        1.0f, 0.0f,
        0.0f, 1.0f,
        1.0f, 1.0f,
    };
    
    static const GLfloat horizontalFlipTextureCoordinates[] = {
        1.0f, 1.0f,
        0.0f, 1.0f,
        1.0f, 0.0f,
        0.0f, 0.0f,
    };
    
    static const GLfloat rotateRightVerticalFlipTextureCoordinates[] = {
        1.0f, 0.0f,
        1.0f, 1.0f,
        0.0f, 0.0f,
        0.0f, 1.0f,
    };
    
    static const GLfloat rotateRightHorizontalFlipTextureCoordinates[] = {
        0.0f, 1.0f,
        0.0f, 0.0f,
        1.0f, 1.0f,
        1.0f, 0.0f,
    };

    static const GLfloat rotate180TextureCoordinates[] = {
        1.0f, 0.0f,
        0.0f, 0.0f,
        1.0f, 1.0f,
        0.0f, 1.0f,
    };
    
    switch(rotationMode)
    {
        case kGPUImageNoRotation: return noRotationTextureCoordinates;
        case kGPUImageRotateLeft: return rotateLeftTextureCoordinates;
        case kGPUImageRotateRight: return rotateRightTextureCoordinates;
        case kGPUImageFlipVertical: return verticalFlipTextureCoordinates;
        case kGPUImageFlipHorizonal: return horizontalFlipTextureCoordinates;
        case kGPUImageRotateRightFlipVertical: return rotateRightVerticalFlipTextureCoordinates;
        case kGPUImageRotateRightFlipHorizontal: return rotateRightHorizontalFlipTextureCoordinates;
        case kGPUImageRotate180: return rotate180TextureCoordinates;
    }
}
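
Step 6 of the initialization, adjusting the vertex coordinates for the fill mode, happens in recalculateViewGeometry, which rescales the quad that newFrameReadyAtTime: draws. A trimmed sketch of that logic follows (inputImageSize is the ivar holding the incoming frame size; AVMakeRectWithAspectRatioInsideRect comes from AVFoundation):

// Derive per-axis scale factors from the fill mode, then bake them into the
// imageVertices quad (the 4 corners of the triangle strip, in clip space).
CGSize currentViewSize = self.bounds.size;
CGRect insetRect = AVMakeRectWithAspectRatioInsideRect(inputImageSize, self.bounds);

CGFloat widthScaling = 1.0, heightScaling = 1.0;
switch (_fillMode)
{
    case kGPUImageFillModeStretch:                    // distort to fill the view
        widthScaling = 1.0;
        heightScaling = 1.0;
        break;
    case kGPUImageFillModePreserveAspectRatio:        // letterbox inside the view
        widthScaling = insetRect.size.width / currentViewSize.width;
        heightScaling = insetRect.size.height / currentViewSize.height;
        break;
    case kGPUImageFillModePreserveAspectRatioAndFill: // crop to fill the view
        widthScaling = currentViewSize.height / insetRect.size.height;
        heightScaling = currentViewSize.width / insetRect.size.width;
        break;
}

imageVertices[0] = -widthScaling; imageVertices[1] = -heightScaling;
imageVertices[2] =  widthScaling; imageVertices[3] = -heightScaling;
imageVertices[4] = -widthScaling; imageVertices[5] =  heightScaling;
imageVertices[6] =  widthScaling; imageVertices[7] =  heightScaling;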

// Implements the GPUImageInput protocol method; performs the OpenGL drawing and presents the result on screen
- (void)newFrameReadyAtTime:(CMTime)frameTime atIndex:(NSInteger)textureIndex;
{
    runSynchronouslyOnVideoProcessingQueue(^{
        [GPUImageContext setActiveShaderProgram:displayProgram];
        [self setDisplayFramebuffer];
        
        // Clear the screen
        glClearColor(backgroundColorRed, backgroundColorGreen, backgroundColorBlue, backgroundColorAlpha);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
       
        glActiveTexture(GL_TEXTURE4);
        glBindTexture(GL_TEXTURE_2D, [inputFramebufferForDisplay texture]);
        glUniform1i(displayInputTextureUniform, 4);
        
        glVertexAttribPointer(displayPositionAttribute, 2, GL_FLOAT, 0, 0, imageVertices);
        glVertexAttribPointer(displayTextureCoordinateAttribute, 2, GL_FLOAT, 0, 0, [GPUImageView textureCoordinatesForRotation:inputRotation]);
        
        // Draw
        glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
        
        // Present
        [self presentFramebuffer];
        [inputFramebufferForDisplay unlock];
        inputFramebufferForDisplay = nil;
    });
}

// Present the framebuffer on screen
- (void)presentFramebuffer;
{
    glBindRenderbuffer(GL_RENDERBUFFER, displayRenderbuffer);
    [[GPUImageContext sharedImageProcessingContext] presentBufferForDisplay];
}
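
presentFramebuffer relies on the display framebuffer and renderbuffer created in step 5 of commonInit. A trimmed sketch of createDisplayFramebuffer shows how the renderbuffer's storage is tied to the view's CAEAGLLayer so that presenting it puts pixels on screen:

// Trimmed sketch of -createDisplayFramebuffer: an on-screen framebuffer whose
// color attachment is a renderbuffer backed by this view's CAEAGLLayer.
- (void)createDisplayFramebuffer;
{
    [GPUImageContext useImageProcessingContext];

    glGenFramebuffers(1, &displayFramebuffer);
    glBindFramebuffer(GL_FRAMEBUFFER, displayFramebuffer);

    glGenRenderbuffers(1, &displayRenderbuffer);
    glBindRenderbuffer(GL_RENDERBUFFER, displayRenderbuffer);

    // Allocate the renderbuffer's storage from the layer itself, so that
    // presentBufferForDisplay can push the rendered frame to the screen.
    [[[GPUImageContext sharedImageProcessingContext] context] renderbufferStorage:GL_RENDERBUFFER
                                                                     fromDrawable:(CAEAGLLayer *)self.layer];

    GLint backingWidth, backingHeight;
    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &backingWidth);
    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &backingHeight);

    // Attach the renderbuffer as the framebuffer's color attachment.
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, displayRenderbuffer);
}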

GPUImageUIElement

Like GPUImagePicture, GPUImageUIElement can act as the source of a filter chain. Unlike GPUImagePicture, its data does not come from an image: it comes from rendering a UIView or CALayer, much like taking a snapshot of the view or layer. GPUImageUIElement inherits from GPUImageOutput, so it can act as an output; since it does not implement the GPUImageInput protocol, it cannot accept input.

  • Initialization
- (id)initWithView:(UIView *)inputView;
- (id)initWithLayer:(CALayer *)inputLayer;

Initialization takes a UIView or CALayer; during initialization [layer renderInContext:imageContext] renders the layer, and the rendered bitmap becomes a texture object. (The update methods called here are shown, with their implementations, under "Other methods" below.)

- (id)initWithView:(UIView *)inputView;
{
    if (!(self = [super init]))
    {
        return nil;
    }
    
    view = inputView;
    layer = inputView.layer;

    previousLayerSizeInPixels = CGSizeZero;
    [self update];
    
    return self;
}

- (id)initWithLayer:(CALayer *)inputLayer;
{
    if (!(self = [super init]))
    {
        return nil;
    }
    
    view = nil;
    layer = inputLayer;

    previousLayerSizeInPixels = CGSizeZero;
    [self update];

    return self;
}

  • Other methods
- (CGSize)layerSizeInPixels;
- (void)update;
- (void)updateUsingCurrentTime;
- (void)updateWithTimestamp:(CMTime)frameTime;

These methods snapshot the layer into a texture object and pass it to all targets for processing.

// Get the layer's size in pixels
- (CGSize)layerSizeInPixels;
{
    CGSize pointSize = layer.bounds.size;
    return CGSizeMake(layer.contentsScale * pointSize.width, layer.contentsScale * pointSize.height);
}

// Update with an indefinite timestamp
- (void)update;
{
    [self updateWithTimestamp:kCMTimeIndefinite];
}

// Update using an internally accumulated running time
- (void)updateUsingCurrentTime;
{
    if(CMTIME_IS_INVALID(time)) {
        time = CMTimeMakeWithSeconds(0, 600);
        actualTimeOfLastUpdate = [NSDate timeIntervalSinceReferenceDate];
    } else {
        NSTimeInterval now = [NSDate timeIntervalSinceReferenceDate];
        NSTimeInterval diff = now - actualTimeOfLastUpdate;
        time = CMTimeAdd(time, CMTimeMakeWithSeconds(diff, 600));
        actualTimeOfLastUpdate = now;
    }

    [self updateWithTimestamp:time];
}

// Update with an explicit timestamp
- (void)updateWithTimestamp:(CMTime)frameTime;
{
    [GPUImageContext useImageProcessingContext];
    
    CGSize layerPixelSize = [self layerSizeInPixels];
    
    GLubyte *imageData = (GLubyte *) calloc(1, (int)layerPixelSize.width * (int)layerPixelSize.height * 4);
    
    CGColorSpaceRef genericRGBColorspace = CGColorSpaceCreateDeviceRGB();    
    CGContextRef imageContext = CGBitmapContextCreate(imageData, (int)layerPixelSize.width, (int)layerPixelSize.height, 8, (int)layerPixelSize.width * 4, genericRGBColorspace,  kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
//    CGContextRotateCTM(imageContext, M_PI_2);
    CGContextTranslateCTM(imageContext, 0.0f, layerPixelSize.height);
    CGContextScaleCTM(imageContext, layer.contentsScale, -layer.contentsScale);
    //        CGContextSetBlendMode(imageContext, kCGBlendModeCopy); // From Technical Q&A QA1708: http://developer.apple.com/library/ios/#qa/qa1708/_index.html
    
    [layer renderInContext:imageContext];
    
    CGContextRelease(imageContext);
    CGColorSpaceRelease(genericRGBColorspace);
    
    // TODO: This may not work
    outputFramebuffer = [[GPUImageContext sharedFramebufferCache] fetchFramebufferForSize:layerPixelSize textureOptions:self.outputTextureOptions onlyTexture:YES];

    glBindTexture(GL_TEXTURE_2D, [outputFramebuffer texture]);
    // no need to use self.outputTextureOptions here, we always need these texture options
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (int)layerPixelSize.width, (int)layerPixelSize.height, 0, GL_BGRA, GL_UNSIGNED_BYTE, imageData);
    
    free(imageData);
    
    for (id<GPUImageInput> currentTarget in targets)
    {
        if (currentTarget != self.targetToIgnoreForUpdates)
        {
            NSInteger indexOfObject = [targets indexOfObject:currentTarget];
            NSInteger textureIndexOfTarget = [[targetTextureIndices objectAtIndex:indexOfObject] integerValue];
            
            [currentTarget setInputSize:layerPixelSize atIndex:textureIndexOfTarget];
            [currentTarget newFrameReadyAtTime:frameTime atIndex:textureIndexOfTarget];
        }
    }    
}
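
updateUsingCurrentTime is what makes GPUImageUIElement useful for animated overlays: driven from a display link, it re-snapshots the layer every frame with a steadily advancing timestamp. A hypothetical driver (OverlayDriver and its element property are illustrative names, not part of GPUImage) might look like:

#import <GPUImage/GPUImage.h>

@interface OverlayDriver : NSObject
// Assumed to already be wired into a filter chain elsewhere.
@property (nonatomic, strong) GPUImageUIElement *element;
@end

@implementation OverlayDriver

- (void)start
{
    // Re-render the UI element once per screen refresh.
    CADisplayLink *displayLink = [CADisplayLink displayLinkWithTarget:self
                                                             selector:@selector(tick:)];
    [displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)tick:(CADisplayLink *)link
{
    // Snapshot the layer with an advancing timestamp and notify all targets.
    [self.element updateUsingCurrentTime];
}

@end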

Sample code

  • GPUImagePicture with GPUImageView. The result is shown in GPUImagePicture.png.
@interface ViewController ()
@property (weak, nonatomic) IBOutlet GPUImageView *imageView;
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    
    [_imageView setBackgroundColorRed:1.0 green:1.0 blue:1.0 alpha:1.0];
    
    GPUImagePicture *picture = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"1.jpg"]];
    GPUImageGrayscaleFilter *filter = [[GPUImageGrayscaleFilter alloc] init];
    [picture addTarget:filter];
    [filter addTarget:_imageView];
    [filter useNextFrameForImageCapture];
    
    [picture processImage];
}
  • GPUImageUIElement with GPUImageView. The result is shown in GPUImageUIElement.png.
@interface SecondViewController ()
@property (weak, nonatomic) IBOutlet GPUImageView *imageView;
@property (weak, nonatomic) IBOutlet UIView *bgView;
@end

@implementation SecondViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    
    [_imageView setBackgroundColorRed:1.0 green:1.0 blue:1.0 alpha:1.0];
    
    GPUImageUIElement *element = [[GPUImageUIElement alloc] initWithView:_bgView];
    GPUImageHueFilter *filter = [[GPUImageHueFilter alloc] init];
    [element addTarget:filter];
    [filter addTarget:_imageView];
    [filter useNextFrameForImageCapture];
    
    [element update];
}

Summary

GPUImagePicture, GPUImageView, and GPUImageUIElement come up constantly when processing images, snapshotting UI, and displaying results. Being familiar with these classes is therefore important for understanding the GPUImage framework.

Source code: GPUImage source reading series https://github.com/QinminiOS/GPUImage
Series index: GPUImage source reading http://www.reibang.com/nb/11749791
