This "class" is actually a category on UIImage whose main job is to decode (force-decompress) a UIImage. Why does a UIImage need to be decoded at all? Can't it just be used directly?
Strictly speaking it can: if we load an image with imageNamed:, by default the system decodes it on the main thread the first time the image is rendered. That decode step turns the compressed data into a bitmap that UI controls can use directly.
Once a large number of images loaded via imageNamed: have to be decoded on the main thread, the UI starts to stutter. There are two fairly simple ways to deal with this:
1. Don't load images with imageNamed:; use another API such as imageWithContentsOfFile: instead.
2. Decode the images ourselves, off the main thread. Images downloaded from the network can't be displayed by UI controls without decoding either, which is why SDWebImage decodes images itself and performs essentially all of that decoding work on background threads (a minimal usage sketch follows this list).
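As a rough illustration of the second approach, here is a minimal sketch (not SDWebImage's actual call site) that force-decodes a downloaded image on a background queue using the category method analyzed below, so that only the final assignment touches the main thread. The header name UIImage+ForceDecode.h is an assumption and may differ between SDWebImage versions.
#import <UIKit/UIKit.h>
#import "UIImage+ForceDecode.h" // assumed header name; adjust to your SDWebImage version
static void setDecodedImage(UIImageView *imageView, NSData *imageData) {
    dispatch_async(dispatch_get_global_queue(QOS_CLASS_USER_INITIATED, 0), ^{
        UIImage *rawImage = [UIImage imageWithData:imageData];
        // Force the expensive decompression off the main thread.
        UIImage *decodedImage = [UIImage decodedImageWithImage:rawImage];
        dispatch_async(dispatch_get_main_queue(), ^{
            // Only the cheap property assignment runs on the main thread.
            imageView.image = decodedImage;
        });
    });
}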
static const size_t kBytesPerPixel = 4; // each pixel takes 4 bytes (RGBA / RGBX)
static const size_t kBitsPerComponent = 8; // 8 bits per color component
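With 4 bytes per pixel, the decoded bitmap can be much larger than the compressed file. For example, a 1920 x 1080 image needs bytesPerRow = 4 x 1920 = 7680 bytes, and 7680 x 1080 ≈ 7.9 MB in total, no matter how small the original PNG/JPEG file was.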
This method simply force-decodes the image into a bitmap so the GPU can render it directly:
+ (nullable UIImage *)decodedImageWithImage:(nullable UIImage *)image {
if (![UIImage shouldDecodeImage:image]) {
return image;
}
// autorelease the bitmap context and all vars to help system to free memory when there are memory warning.
// on iOS7, do not forget to call [[SDImageCache sharedImageCache] clearMemory];
@autoreleasepool{
CGImageRef imageRef = image.CGImage;
CGColorSpaceRef colorspaceRef = [UIImage colorSpaceForImageRef:imageRef];
size_t width = CGImageGetWidth(imageRef);
size_t height = CGImageGetHeight(imageRef);
size_t bytesPerRow = kBytesPerPixel * width;
// kCGImageAlphaNone is not supported in CGBitmapContextCreate.
// Since the original image here has no alpha info, use kCGImageAlphaNoneSkipLast
// to create bitmap graphics contexts without alpha info.
CGContextRef context = CGBitmapContextCreate(NULL,
width,
height,
kBitsPerComponent,
bytesPerRow,
colorspaceRef,
kCGBitmapByteOrderDefault|kCGImageAlphaNoneSkipLast);
if (context == NULL) {
return image;
}
// Draw the image into the context and retrieve the new bitmap image without alpha
CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
CGImageRef imageRefWithoutAlpha = CGBitmapContextCreateImage(context);
UIImage *imageWithoutAlpha = [UIImage imageWithCGImage:imageRefWithoutAlpha
scale:image.scale
orientation:image.imageOrientation];
CGContextRelease(context);
CGImageRelease(imageRefWithoutAlpha);
return imageWithoutAlpha;
}
}
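A quick way to confirm what the decode produced: because the bitmap context above is created with kCGImageAlphaNoneSkipLast, the returned image is backed by an already-decompressed bitmap with no alpha channel. A small, hypothetical check (someImage stands for any non-animated, alpha-free UIImage):
UIImage *decoded = [UIImage decodedImageWithImage:someImage]; // someImage: hypothetical placeholder
CGImageAlphaInfo alphaInfo = CGImageGetAlphaInfo(decoded.CGImage);
// Expected: kCGImageAlphaNoneSkipLast, i.e. an RGBX bitmap whose alpha byte is ignored.
NSLog(@"alpha info after decode: %d", (int)alphaInfo);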
static const CGFloat kDestImageSizeMB = 60.0f; // maximum memory cost allowed for a large image after scaleDown
static const CGFloat kSourceImageTileSizeMB = 20.0f; // size of each tile the source image is split into
static const CGFloat kBytesPerMB = 1024.0f * 1024.0f; // number of bytes in 1 MB
static const CGFloat kPixelsPerMB = kBytesPerMB / kBytesPerPixel; // number of pixels that fit in 1 MB
static const CGFloat kDestTotalPixels = kDestImageSizeMB * kPixelsPerMB; // maximum pixel count of the image after scaleDown
static const CGFloat kTileTotalPixels = kSourceImageTileSizeMB * kPixelsPerMB; // pixel count of each tile during scaleDown
static const CGFloat kDestSeemOverlap = 2.0f; // pixels by which adjacent tiles overlap (declared in the SDWebImage source; referenced below)
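Plugging in the numbers: kPixelsPerMB = 1,048,576 / 4 = 262,144 pixels, so kDestTotalPixels = 60 x 262,144 = 15,728,640 pixels (roughly a 3966 x 3966 image) and kTileTotalPixels = 20 x 262,144 = 5,242,880 pixels. Any source image with more than about 15.7 million pixels will be scaled down so that its decoded bitmap stays within the 60 MB budget.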
The method that decodes the image and scales it down.
The main processing flow:
1. Compute imageScale, i.e. the ratio used for the scaleDown;
2. Allocate the destination bitmap, which is guaranteed to be smaller than 60 MB;
3. Split the source image into tiles and draw each tile into the corresponding region of the destination canvas.
+ (nullable UIImage *)decodedAndScaledDownImageWithImage:(nullable UIImage *)image {
if (![UIImage shouldDecodeImage:image]) {
return image;
}
if (![UIImage shouldScaleDownImage:image]) {
return [UIImage decodedImageWithImage:image];
}
CGContextRef destContext;
// autorelease the bitmap context and all vars to help system to free memory when there are memory warning.
// on iOS7, do not forget to call [[SDImageCache sharedImageCache] clearMemory];
@autoreleasepool {
CGImageRef sourceImageRef = image.CGImage;
CGSize sourceResolution = CGSizeZero;
sourceResolution.width = CGImageGetWidth(sourceImageRef);
sourceResolution.height = CGImageGetHeight(sourceImageRef);
float sourceTotalPixels = sourceResolution.width * sourceResolution.height;
// Determine the scale ratio to apply to the input image
// that results in an output image of the defined size.
// see kDestImageSizeMB, and how it relates to destTotalPixels.
float imageScale = kDestTotalPixels / sourceTotalPixels;
CGSize destResolution = CGSizeZero;
destResolution.width = (int)(sourceResolution.width*imageScale);
destResolution.height = (int)(sourceResolution.height*imageScale);
// current color space
CGColorSpaceRef colorspaceRef = [UIImage colorSpaceForImageRef:sourceImageRef];
size_t bytesPerRow = kBytesPerPixel * destResolution.width;
// Allocate enough pixel data to hold the output image.
void* destBitmapData = malloc( bytesPerRow * destResolution.height );
if (destBitmapData == NULL) {
return image;
}
// kCGImageAlphaNone is not supported in CGBitmapContextCreate.
// Since the original image here has no alpha info, use kCGImageAlphaNoneSkipLast
// to create bitmap graphics contexts without alpha info.
destContext = CGBitmapContextCreate(destBitmapData,
destResolution.width,
destResolution.height,
kBitsPerComponent,
bytesPerRow,
colorspaceRef,
kCGBitmapByteOrderDefault|kCGImageAlphaNoneSkipLast);
if (destContext == NULL) {
free(destBitmapData);
return image;
}
CGContextSetInterpolationQuality(destContext, kCGInterpolationHigh);
// Now define the size of the rectangle to be used for the
// incremental blits from the input image to the output image.
// we use a source tile width equal to the width of the source
// image due to the way that iOS retrieves image data from disk.
// iOS must decode an image from disk in full width 'bands', even
// if current graphics context is clipped to a subrect within that
// band. Therefore we fully utilize all of the pixel data that results
// from a decoding operation by anchoring our tile size to the full
// width of the input image.
CGRect sourceTile = CGRectZero;
sourceTile.size.width = sourceResolution.width;
// The source tile height is dynamic. Since we specified the size
// of the source tile in MB, see how many rows of pixels high it
// can be given the input image width.
sourceTile.size.height = (int)(kTileTotalPixels / sourceTile.size.width );
sourceTile.origin.x = 0.0f;
// The output tile is the same proportions as the input tile, but
// scaled to image scale.
CGRect destTile;
destTile.size.width = destResolution.width;
destTile.size.height = sourceTile.size.height * imageScale;
destTile.origin.x = 0.0f;
// The source seem overlap is proportionate to the destination seem overlap.
// this is the amount of pixels to overlap each tile as we assemble the output image.
float sourceSeemOverlap = (int)((kDestSeemOverlap/destResolution.height)*sourceResolution.height);
CGImageRef sourceTileImageRef;
// calculate the number of read/write operations required to assemble the
// output image.
int iterations = (int)( sourceResolution.height / sourceTile.size.height );
// If tile height doesn't divide the image height evenly, add another iteration
// to account for the remaining pixels.
int remainder = (int)sourceResolution.height % (int)sourceTile.size.height;
if(remainder) {
iterations++;
}
// Add seem overlaps to the tiles, but save the original tile height for y coordinate calculations.
float sourceTileHeightMinusOverlap = sourceTile.size.height;
sourceTile.size.height += sourceSeemOverlap;
destTile.size.height += kDestSeemOverlap;
for( int y = 0; y < iterations; ++y ) {
@autoreleasepool {
sourceTile.origin.y = y * sourceTileHeightMinusOverlap + sourceSeemOverlap;
destTile.origin.y = destResolution.height - (( y + 1 ) * sourceTileHeightMinusOverlap * imageScale + kDestSeemOverlap);
sourceTileImageRef = CGImageCreateWithImageInRect( sourceImageRef, sourceTile );
if( y == iterations - 1 && remainder ) {
float dify = destTile.size.height;
destTile.size.height = CGImageGetHeight( sourceTileImageRef ) * imageScale;
dify -= destTile.size.height;
destTile.origin.y += dify;
}
CGContextDrawImage( destContext, destTile, sourceTileImageRef );
CGImageRelease( sourceTileImageRef );
}
}
CGImageRef destImageRef = CGBitmapContextCreateImage(destContext);
CGContextRelease(destContext);
if (destImageRef == NULL) {
return image;
}
UIImage *destImage = [UIImage imageWithCGImage:destImageRef scale:image.scale orientation:image.imageOrientation];
CGImageRelease(destImageRef);
if (destImage == nil) {
return image;
}
return destImage;
}
}
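To make the flow concrete, consider a hypothetical 4000 x 6000 source image (24,000,000 pixels, about 91.6 MB once decoded): imageScale = 15,728,640 / 24,000,000 ≈ 0.655, so destResolution becomes roughly 2621 x 3932, about 10.3 million pixels or a bitmap of roughly 39 MB. Note that imageScale is a ratio of pixel counts applied to both dimensions, so the result lands well below the 60 MB cap rather than exactly at it. The source tile spans the full 4000-pixel width and kTileTotalPixels / 4000 = 1310 rows, so iterations = 6000 / 1310 = 4 with a remainder of 760 rows, i.e. 5 passes through the loop, each drawing one horizontal band (plus the small seam overlap) into the destination context.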
Whether an image should be force-decoded at all:
1. Animated images are not decoded;
2. Images that carry an alpha channel are not decoded.
+ (BOOL)shouldDecodeImage:(nullable UIImage *)image {
// Prevent "CGBitmapContextCreateImage: invalid context 0x0" error
if (image == nil) {
return NO;
}
// do not decode animated images
if (image.images != nil) {
return NO;
}
CGImageRef imageRef = image.CGImage;
CGImageAlphaInfo alpha = CGImageGetAlphaInfo(imageRef);
BOOL anyAlpha = (alpha == kCGImageAlphaFirst ||
alpha == kCGImageAlphaLast ||
alpha == kCGImageAlphaPremultipliedFirst ||
alpha == kCGImageAlphaPremultipliedLast);
// do not decode images with alpha
if (anyAlpha) {
return NO;
}
return YES;
}
Whether an image needs to be scaled down: if the source image's total pixel count is greater than kDestTotalPixels, it should be scaled down.
+ (BOOL)shouldScaleDownImage:(nonnull UIImage *)image {
BOOL shouldScaleDown = YES;
CGImageRef sourceImageRef = image.CGImage;
CGSize sourceResolution = CGSizeZero;
sourceResolution.width = CGImageGetWidth(sourceImageRef);
sourceResolution.height = CGImageGetHeight(sourceImageRef);
float sourceTotalPixels = sourceResolution.width * sourceResolution.height;
float imageScale = kDestTotalPixels / sourceTotalPixels;
if (imageScale < 1) {
shouldScaleDown = YES;
} else {
shouldScaleDown = NO;
}
return shouldScaleDown;
}
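For example, a hypothetical 5000 x 4000 image has 20,000,000 pixels, so imageScale = 15,728,640 / 20,000,000 ≈ 0.79 < 1 and the image will be scaled down; a 3000 x 2000 image has 6,000,000 pixels, imageScale ≈ 2.6, so it is only force-decoded without any scaling.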
Get the color space to use for the image. (If you are not familiar with what an image's color space is, look it up.)
+ (CGColorSpaceRef)colorSpaceForImageRef:(CGImageRef)imageRef {
// current
CGColorSpaceModel imageColorSpaceModel = CGColorSpaceGetModel(CGImageGetColorSpace(imageRef));
CGColorSpaceRef colorspaceRef = CGImageGetColorSpace(imageRef);
BOOL unsupportedColorSpace = (imageColorSpaceModel == kCGColorSpaceModelUnknown ||
imageColorSpaceModel == kCGColorSpaceModelMonochrome ||
imageColorSpaceModel == kCGColorSpaceModelCMYK ||
imageColorSpaceModel == kCGColorSpaceModelIndexed);
if (unsupportedColorSpace) {
colorspaceRef = CGColorSpaceCreateDeviceRGB();
CFAutorelease(colorspaceRef);
}
return colorspaceRef;
}
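A small illustration of why this helper exists: a grayscale JPEG or an indexed-color PNG reports a color space model that CGBitmapContextCreate cannot pair with the RGBA/RGBX pixel format used above, so the helper substitutes device RGB. A hypothetical check, assuming the helper is visible at the call site (in SDWebImage it may only be declared inside the category implementation):
CGImageRef imageRef = grayscaleImage.CGImage; // grayscaleImage: any monochrome UIImage, hypothetical placeholder
CGColorSpaceModel model = CGColorSpaceGetModel(CGImageGetColorSpace(imageRef));
NSLog(@"original model: %d", (int)model); // e.g. kCGColorSpaceModelMonochrome
CGColorSpaceRef usableColorSpace = [UIImage colorSpaceForImageRef:imageRef];
NSLog(@"model used for decoding: %d", (int)CGColorSpaceGetModel(usableColorSpace)); // kCGColorSpaceModelRGB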