In iOS we come across the CVPixelBufferRef type all the time. The data returned by camera capture arrives as CMSampleBufferRef objects, each of which contains a CVPixelBufferRef; hardware video decoding likewise returns its data as a CVPixelBufferRef, in one of the two NV12 formats (kCVPixelFormatType_420YpCbCr8BiPlanarFullRange or kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange). Because it is a C object, it is not managed by ARC: the developer has to manage the reference count, and with it the object's lifetime, manually. CVPixelBufferRetain and CVPixelBufferRelease increment and decrement the reference count; they are equivalent to CFRetain and CFRelease, so CFGetRetainCount can be used to inspect the current count.
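A minimal sketch of this manual lifetime management (the dimensions and format here are arbitrary placeholders):

CVPixelBufferRef pixelBuffer = NULL;
CVPixelBufferCreate(kCFAllocatorDefault, 1280, 720,
                    kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
                    NULL, &pixelBuffer);              // retain count is 1
CVPixelBufferRetain(pixelBuffer);                     // retain count is 2
NSLog(@"retain count: %ld", CFGetRetainCount(pixelBuffer));
CVPixelBufferRelease(pixelBuffer);                    // back to 1
CVPixelBufferRelease(pixelBuffer);                    // buffer is deallocated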
A CVImageBufferRef is obtained from a sample buffer like this:
CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(videoSample);
From the CVImageBufferRef we can then extract the YUV data as YUV420 (NV12):
// AWVideoEncoder.m
- (NSData *)convertVideoSampleBufferToYuvData:(CMSampleBufferRef)videoSample {
    // CMSampleBufferGetImageBuffer returns the CVImageBufferRef,
    // which holds the pointers to the YUV420 (NV12) planes.
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(videoSample);
    // Lock the base address before touching the pixel data.
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    // Image width and height in pixels
    size_t pixelWidth = CVPixelBufferGetWidth(pixelBuffer);
    size_t pixelHeight = CVPixelBufferGetHeight(pixelBuffer);
    // Bytes occupied by the Y plane
    size_t y_size = pixelWidth * pixelHeight;
    // Bytes occupied by the interleaved UV plane (half of Y for 4:2:0)
    size_t uv_size = y_size / 2;
    uint8_t *yuv_frame = aw_alloc(uv_size + y_size);
    // Copy the Y plane. Note: this assumes tightly packed rows,
    // i.e. bytes-per-row == pixelWidth; otherwise copy row by row.
    uint8_t *y_frame = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
    memcpy(yuv_frame, y_frame, y_size);
    // Copy the interleaved UV plane
    uint8_t *uv_frame = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);
    memcpy(yuv_frame + y_size, uv_frame, uv_size);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    // Wrap without copying; the NSData takes ownership of yuv_frame
    return [NSData dataWithBytesNoCopy:yuv_frame length:y_size + uv_size];
}
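A typical call site (a sketch; the delegate method belongs to AVCaptureVideoDataOutputSampleBufferDelegate) converts each camera frame before handing it to the encoder:

// Hypothetical usage from the capture callback:
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    NSData *yuvData = [self convertVideoSampleBufferToYuvData:sampleBuffer];
    // hand yuvData to the video encoder ...
}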
As the name suggests, CVPixelBufferRef is a pixel-buffer (bitmap image) type, and the CV prefix tells us it belongs to the Core Video framework.
Conversely, NV12 data can be used to fill a CVPixelBufferRef, for example:
- (CVPixelBufferRef)createCVPixelBufferRefFromNV12buffer:(unsigned char *)buffer width:(int)w height:(int)h {
    NSDictionary *pixelAttributes = @{(NSString *)kCVPixelBufferIOSurfacePropertiesKey : @{}};
    CVPixelBufferRef pixelBuffer = NULL;
    // Use kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange here for video-range NV12.
    CVReturn result = CVPixelBufferCreate(kCFAllocatorDefault,
                                          w,
                                          h,
                                          kCVPixelFormatType_420YpCbCr8BiPlanarFullRange,
                                          (__bridge CFDictionaryRef)(pixelAttributes),
                                          &pixelBuffer);
    // Check the result before using the buffer.
    if (result != kCVReturnSuccess) {
        NSLog(@"Unable to create cvpixelbuffer %d", result);
        return NULL;
    }
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    // Copy the Y plane (assumes tightly packed rows; otherwise copy row by row).
    unsigned char *yDestPlane = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
    memcpy(yDestPlane, buffer, w * h);
    // Copy the interleaved UV plane, which follows the Y plane in NV12.
    unsigned char *uvDestPlane = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);
    memcpy(uvDestPlane, buffer + w * h, w * h / 2);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    return pixelBuffer;
}
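The method follows the Core Foundation Create rule, so the caller owns the returned buffer and must balance it with CVPixelBufferRelease. A hypothetical call:

CVPixelBufferRef pb = [self createCVPixelBufferRefFromNV12buffer:nv12Data width:1280 height:720];
// ... use pb ...
CVPixelBufferRelease(pb);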
CVPixelBufferGetBaseAddressOfPlane returns the data pointer for each plane. Before obtaining the address you must call CVPixelBufferLockBaseAddress; this reflects the fact that a CVPixelBufferRef's storage is not necessarily main memory but may live in other backing stores, such as graphics memory (VRAM). Locking maps the data to an accessible address before you touch it, and it also guards against read/write conflicts.
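When you only read the data (as in the conversion above), you can pass kCVPixelBufferLock_ReadOnly as the lock flag instead of 0, which tells Core Video that no write-back is needed; the same flag must then be passed to the matching CVPixelBufferUnlockBaseAddress call.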
When copying the data row by row, the pixel buffer's internal address advances by current_row * bytesPerRowChrominance each iteration; that is the pixel buffer's internal memory layout. My source data, on the other hand, is tightly packed with no row alignment or padding, so its step is current_row * _outVideoWidth, the actual video frame width, and each copy should also be that actual width. For the chroma plane, the width and height are each half of the luma plane's, but every element carries both a U and a V byte, so each chroma row occupies the same number of bytes as a luma row; the per-row copy size works out to _outVideoWidth / 2 * 2.
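A minimal sketch of that stride-aware, row-by-row copy (assuming srcY and srcUV point into a tightly packed NV12 frame of _outVideoWidth x _outVideoHeight, and that pixelBuffer is already locked; these names are illustrative, not from the original code):

size_t bytesPerRowLuminance = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);
uint8_t *yDest = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
for (int row = 0; row < _outVideoHeight; row++) {
    // destination advances by its own bytes-per-row, source by the frame width
    memcpy(yDest + row * bytesPerRowLuminance,
           srcY + row * _outVideoWidth,
           _outVideoWidth);
}
size_t bytesPerRowChrominance = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1);
uint8_t *uvDest = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);
for (int row = 0; row < _outVideoHeight / 2; row++) {
    // each chroma row holds _outVideoWidth / 2 UV pairs = _outVideoWidth bytes
    memcpy(uvDest + row * bytesPerRowChrominance,
           srcUV + row * _outVideoWidth,
           _outVideoWidth);
}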
## Generating a CVPixelBufferRef from a UIImage
- (CVPixelBufferRef)CVPixelBufferRefFromUiImage:(UIImage *)img
{
    CGImageRef image = [img CGImage];
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    CVPixelBufferRef pxbuffer = NULL;
    CGFloat frameWidth = CGImageGetWidth(image);
    CGFloat frameHeight = CGImageGetHeight(image);
    // Create an empty 32-bit ARGB pixel buffer.
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                          frameWidth,
                                          frameHeight,
                                          kCVPixelFormatType_32ARGB,
                                          (__bridge CFDictionaryRef)options,
                                          &pxbuffer);
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);
    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);
    // Wrap the buffer's memory in a CGBitmapContext and draw the image into it.
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata,
                                                 frameWidth,
                                                 frameHeight,
                                                 8,
                                                 CVPixelBufferGetBytesPerRow(pxbuffer),
                                                 rgbColorSpace,
                                                 (CGBitmapInfo)kCGImageAlphaNoneSkipFirst);
    NSParameterAssert(context);
    CGContextConcatCTM(context, CGAffineTransformIdentity);
    CGContextDrawImage(context, CGRectMake(0, 0, frameWidth, frameHeight), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
    return pxbuffer;
}
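As with the NV12 case, the returned pxbuffer carries the +1 retain count from CVPixelBufferCreate, so the caller must eventually release it with CVPixelBufferRelease.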
## Generating a UIImage from a CVPixelBufferRef
- (UIImage *)createUIImageFromNV12buffer:(unsigned char *)buffer width:(int)w height:(int)h {
    NSDictionary *pixelAttributes = @{(NSString *)kCVPixelBufferIOSurfacePropertiesKey : @{}};
    CVPixelBufferRef pixelBuffer = NULL;
    // Use kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange here for video-range NV12.
    CVReturn result = CVPixelBufferCreate(kCFAllocatorDefault,
                                          w,
                                          h,
                                          kCVPixelFormatType_420YpCbCr8BiPlanarFullRange,
                                          (__bridge CFDictionaryRef)(pixelAttributes),
                                          &pixelBuffer);
    if (result != kCVReturnSuccess) {
        NSLog(@"Unable to create cvpixelbuffer %d", result);
        return nil;
    }
    // Fill the Y and interleaved UV planes from the NV12 buffer, as above.
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    unsigned char *yDestPlane = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
    memcpy(yDestPlane, buffer, w * h);
    unsigned char *uvDestPlane = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);
    memcpy(uvDestPlane, buffer + w * h, w * h / 2);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    // CVPixelBufferRef -> CIImage -> CGImage
    CIImage *coreImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CIContext *temporaryContext = [CIContext contextWithOptions:nil];
    CGImageRef videoImage = [temporaryContext createCGImage:coreImage
                                                   fromRect:CGRectMake(0, 0, w, h)];
    // CGImage -> UIImage
    UIImage *image = [[UIImage alloc] initWithCGImage:videoImage
                                                scale:1.0
                                          orientation:UIImageOrientationRight];
    CVPixelBufferRelease(pixelBuffer);
    CGImageRelease(videoImage);
    return image;
}
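Note that creating a CIContext with contextWithOptions: is expensive; if you convert frames continuously, create the context once and reuse it rather than allocating a new one on every call.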
## Cropping a CVPixelBufferRef
![image.png](https://upload-images.jianshu.io/upload_images/1996279-6c1f4c38fb8bf40d.png?imageMogr2/auto-orient/strip%7CimageView2/2/w/1240)
With vImage you can operate on the buffer data directly, without converting it to any intermediate image format.
outImg receives the cropped and scaled image data. The ratio between outWidth and cropWidth determines the scaling. Setting cropX0 = 0 and cropY0 = 0 with cropWidth and cropHeight equal to the original size means no cropping (the whole source image is used); setting outWidth = cropWidth and outHeight = cropHeight results in no scaling. Note that inBuff.rowBytes must always be the bytes-per-row of the full source buffer, not of the cropped region.
// Requires the Accelerate framework: #import <Accelerate/Accelerate.h>
// Assumes a 4-byte-per-pixel source buffer (e.g. kCVPixelFormatType_32BGRA).
// Crop/scale parameters: set these before running the code below.
int cropX0, cropY0, cropHeight, cropWidth, outWidth, outHeight;
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer, 0);
void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
vImage_Buffer inBuff;
inBuff.height = cropHeight;
inBuff.width = cropWidth;
inBuff.rowBytes = bytesPerRow;
// Offset to the first pixel of the crop rectangle (4 bytes per pixel).
int startpos = cropY0 * (int)bytesPerRow + 4 * cropX0;
inBuff.data = (unsigned char *)baseAddress + startpos;
unsigned char *outImg = (unsigned char *)malloc(4 * outWidth * outHeight);
vImage_Buffer outBuff = {outImg, outHeight, outWidth, 4 * outWidth};
// Scale the cropped region into the output buffer.
vImage_Error err = vImageScale_ARGB8888(&inBuff, &outBuff, NULL, 0);
if (err != kvImageNoError) NSLog(@" error %ld", err);
CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
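For example, a centered half-size crop with no scaling would fill in the parameters like this (a hypothetical setup; these assignments belong right after the declarations at the top of the snippet):

cropWidth  = (int)CVPixelBufferGetWidth(imageBuffer) / 2;
cropHeight = (int)CVPixelBufferGetHeight(imageBuffer) / 2;
cropX0 = cropWidth / 2;    // center the crop rectangle horizontally
cropY0 = cropHeight / 2;   // and vertically
outWidth  = cropWidth;     // outWidth == cropWidth   -> no horizontal scaling
outHeight = cropHeight;    // outHeight == cropHeight -> no vertical scaling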