iOS development: scaling the NV12 data captured by the camera (code implementation).
For an explanation of YUV formats and of stride (row pitch), see: http://www.reibang.com/p/eace8c08b169
The CVImageBufferRef inside the CMSampleBufferRef captured by the camera is in NV12 format. To scale it or add a watermark, first convert it to I420, scale or watermark the I420 data, convert it back to NV12, then wrap the NV12 data into a CVImageBufferRef, and finally wrap the CVImageBufferRef into a CMSampleBufferRef.
The conversion code below uses the libyuv library:
Note in particular that memory is allocated based on the stride (bytes per row), not the width. Many posts online compute the size as width x height x 1.5, but that only works when the target width is a multiple of the alignment (16 before iOS 13, 64 on iOS 13 and later). For example, scaling 886x1920 down to 720x1280 can use the width directly, because 720 is a multiple of 16. Scaling to 710x1280, however, cannot: the stride has to be rounded up to (710 / 16 + 1) x 16 = 720, as the code below shows. iOS 13 uses 64-byte alignment (the stride is a multiple of 64), while earlier versions use 16-byte alignment.
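As a minimal sketch of that round-up calculation (alignedStride is a hypothetical helper name used only for illustration, not part of the code below):
// Round a width up to the next multiple of `alignment` (16 before iOS 13, 64 on iOS 13 and later).
static int alignedStride(int width, int alignment) {
    return (width + alignment - 1) / alignment * alignment;
}
// alignedStride(710, 16) == 720, alignedStride(720, 16) == 720, alignedStride(710, 64) == 768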
At the end of each conversion step the corresponding YUV data is written to disk (see the commented-out code below), so you can open it in a YUV viewer to check whether the conversion succeeded.
YUV viewer, Baidu Netdisk download link: https://pan.baidu.com/s/1A4Vt6NMedOT4ASVxiVmjcg extraction code: cyvb
1. Scale a CMSampleBufferRef to the specified size and return a CVPixelBufferRef:
/**
* Downscale an NV12 (420f) sample buffer
* The data in the stream is NV12; since NV12 cannot be processed directly, it is first
* converted to I420, the I420 data is scaled, and the result is converted back to NV12.
* A separate function then wraps the result back into a sample buffer.
* Note: the target width and height must be even; odd values are rounded up internally.
*/
+ (CVPixelBufferRef)convertNV12ToI420Scale:(CMSampleBufferRef)sampleBufRef scaleSize:(CGSize)scaleSize {
int scale_width = scaleSize.width;
int scale_height = scaleSize.height;
// Make sure the width and height are even (required by 4:2:0 chroma subsampling)
if (scale_width % 2 != 0) {
scale_width++;
}
if (scale_height % 2 != 0) {
scale_height++;
}
//CVPixelBufferRef is a typedef of CVImageBufferRef; the two are used in almost the same way.
//Get the image buffer (CVImageBufferRef) from the CMSampleBuffer
CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBufRef);
if (!pixelBuffer) {
return NULL;
}
//Lock the base address before accessing the pixel data
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
//Image width in pixels
size_t buffer_width = CVPixelBufferGetWidth(pixelBuffer);
//Image height in pixels
size_t buffer_height = CVPixelBufferGetHeight(pixelBuffer);
//Y-plane base address of the CVImageBufferRef
uint8_t *src_y_frame = (unsigned char *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
//UV-plane base address of the CVImageBufferRef
uint8_t *src_uv_frame =(unsigned char *) CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);
//y stride
size_t plane1_stride = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);
//uv stride
size_t plane2_stride = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1);
//y height
size_t plane1_height = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0);
//uv height
size_t plane2_height = CVPixelBufferGetHeightOfPlane(pixelBuffer, 1);
//y_size
size_t plane1_size = plane1_stride * plane1_height;
//uv_size
size_t plane2_size = plane2_stride * plane2_height;
//yuv_size (total number of bytes to allocate)
size_t frame_size = plane1_size + plane2_size;
size_t buffer_u_stride = plane2_stride / 2;
size_t buffer_v_stride = plane2_stride / 2;
// 1. Convert NV12 to I420
// Allocate frame_size bytes to hold the converted I420 data
uint8* buffer_frame = (unsigned char *)malloc(frame_size);
uint8* buffer_u = buffer_frame + plane1_size;
uint8* buffer_v = buffer_u + plane1_size / 4;
libyuv::NV12ToI420(/*const uint8 *src_y*/ src_y_frame,
/*int src_stride_y*/ (int)plane1_stride,
/*const uint8 *src_uv*/ src_uv_frame,
/*int src_stride_uv*/ (int)plane2_stride,
/*uint8 *dst_y*/ buffer_frame,
/*int dst_stride_y*/ (int)plane1_stride,
/*uint8 *dst_u*/ buffer_u,
/*int dst_stride_u*/ (int)buffer_u_stride,
/*uint8 *dst_v*/ buffer_v,
/*int dst_stride_v*/ (int)buffer_v_stride,
/*int width*/ (int)buffer_width,
/*int height*/ (int)buffer_height);
// static NSInteger count = 0;
// count++;
// if (count == 1) {
// NSData *dstData = [NSData dataWithBytes:buffer_frame length:frame_size];
// NSString *dstPath = [NSString stringWithFormat:@"%@%@", NSHomeDirectory(), @"/Documents/i420.yuv"];
// if ([[NSFileManager defaultManager] fileExistsAtPath:dstPath]) {
// [[NSFileManager defaultManager] removeItemAtPath:dstPath error:nil];
// }
// [dstData writeToFile:dstPath atomically:NO];
// NSLog(@"============buffer_size:%@ x %@, strate_width:%@, scale_size:%@ x %@", @(buffer_width), @(buffer_height), @(plane1_stride), @(scale_width), @(scale_hight));
// }
// 2. Scale the I420 data to the target size
int scale_plane1_stride = scale_width;
// The stride must be a multiple of the alignment (16 or 64 bytes); note that iOS 13 and earlier versions use different alignments
int stride_length = 16;
if (@available(iOS 13.0, *)) {
stride_length = 64;
} else {
stride_length = 16;
}
if ((scale_width % stride_length) != 0) {
scale_plane1_stride = (scale_width / stride_length + 1) * stride_length;
}
int scale_plane2_stride = scale_plane1_stride;
int scale_plane1_height = scale_height;
int scale_plane2_height = scale_height / 2;
int scale_plane1_size = scale_plane1_stride * scale_plane1_height;
int scale_plane2_size = scale_plane2_stride * scale_plane2_height;
int scale_frame_size = scale_plane1_size + scale_plane2_size;
uint8* scale_buffer = (unsigned char *)malloc(scale_frame_size);
uint8* scale_buffer_u = scale_buffer + scale_plane1_size;
uint8* scale_buffer_v = scale_buffer_u + scale_plane1_size / 4;
libyuv::I420Scale(/*const uint8 *src_y*/ buffer_frame,
/*int src_stride_y*/ (int)plane1_stride,
/*const uint8 *src_u*/ buffer_u,
/*int src_stride_u*/ (int)plane2_stride >> 1,
/*const uint8 *src_v*/ buffer_v,
/*int src_stride_v*/ (int)plane2_stride >> 1,
/*int src_width*/ (int)buffer_width,
/*int src_height*/ (int)buffer_height,
/*uint8 *dst_y*/ scale_buffer,
/*int dst_stride_y*/ scale_plane1_stride,
/*uint8 *dst_u*/ scale_buffer_u,
/*int dst_stride_u*/ scale_plane1_stride >> 1,
/*uint8 *dst_v*/ scale_buffer_v,
/*int dst_stride_v*/ scale_plane1_stride >> 1,
/*int dst_width*/ scale_width,
/*int dst_height*/ scale_height,
/*enum FilterMode filtering*/ libyuv::kFilterNone
);
// if (count == 1) {
// NSData *dstData = [NSData dataWithBytes:scale_buffer length:scale_frame_size];
// NSString *dstPath = [NSString stringWithFormat:@"%@%@", NSHomeDirectory(), @"/Documents/scalei420.yuv"];
// if ([[NSFileManager defaultManager] fileExistsAtPath:dstPath]) {
// [[NSFileManager defaultManager] removeItemAtPath:dstPath error:nil];
// }
// [dstData writeToFile:dstPath atomically:NO];
// }
// 3. Convert the scaled I420 data back to NV12
int nv12_plane1_stride = scale_plane1_stride;
int nv12_width = scale_width;
int nv12_height = scale_height;
int nv12_frame_size = scale_frame_size;
uint8 *nv12_dst_y = (uint8 *)malloc(nv12_frame_size);
uint8 *nv12_dst_uv = nv12_dst_y + nv12_plane1_stride * nv12_height;
libyuv::I420ToNV12(/*const uint8 *src_y*/ scale_buffer,
/*int src_stride_y*/ scale_plane1_stride,
/*const uint8 *src_u*/ scale_buffer_u,
/*int src_stride_u*/ scale_plane1_stride >> 1,
/*const uint8 *src_v*/ scale_buffer_v,
/*int src_stride_v*/ scale_plane1_stride >> 1,
/*uint8 *dst_y*/ nv12_dst_y,
/*int dst_stride_y*/ nv12_plane1_stride,
/*uint8 *dst_uv*/ nv12_dst_uv,
/*int dst_stride_uv*/ nv12_plane1_stride,
/*int width*/ nv12_width,
/*int height*/ nv12_height);
// if (count == 1) {
// NSData *dstData = [NSData dataWithBytes:nv12_dst_y length:nv12_frame_size];
// NSString *dstPath = [NSString stringWithFormat:@"%@%@", NSHomeDirectory(), @"/Documents/toNv12.yuv"];
// if ([[NSFileManager defaultManager] fileExistsAtPath:dstPath]) {
// [[NSFileManager defaultManager] removeItemAtPath:dstPath error:nil];
// }
// [dstData writeToFile:dstPath atomically:NO];
// }
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
free(buffer_frame);
free(scale_buffer);
// 4. Wrap the NV12 data into a CVPixelBufferRef
NSDictionary *pixelAttributes = @{(id)kCVPixelBufferIOSurfacePropertiesKey : @{}};
CVPixelBufferRef dstPixelBuffer = NULL;
CVReturn result = CVPixelBufferCreate(kCFAllocatorDefault,
nv12_width, nv12_height, kCVPixelFormatType_420YpCbCr8BiPlanarFullRange,
(__bridge CFDictionaryRef)pixelAttributes, &dstPixelBuffer);
if (result != kCVReturnSuccess) {
NSLog(@"Unable to create cvpixelbuffer %d", result);
free(nv12_dst_y);
return NULL;
}
CVPixelBufferLockBaseAddress(dstPixelBuffer, 0);
// Copy the NV12 planes into the new pixel buffer.
// Note: this assumes the destination buffer's bytes-per-row equals nv12_plane1_stride
// (which is why the stride was aligned to 16/64 above); a row-by-row copy using
// CVPixelBufferGetBytesPerRowOfPlane would be more robust.
uint8_t *yDstPlane = (uint8*)CVPixelBufferGetBaseAddressOfPlane(dstPixelBuffer, 0);
memcpy(yDstPlane, nv12_dst_y, nv12_plane1_stride * nv12_height);
uint8_t *uvDstPlane = (uint8*)CVPixelBufferGetBaseAddressOfPlane(dstPixelBuffer, 1);
memcpy(uvDstPlane, nv12_dst_uv, nv12_plane1_stride * nv12_height / 2);
CVPixelBufferUnlockBaseAddress(dstPixelBuffer, 0);
free(nv12_dst_y);
return dstPixelBuffer;
}
2. Convert a CVPixelBufferRef to a CMSampleBufferRef:
// Wrap an NV12 CVPixelBufferRef back into a sample buffer
+ (CMSampleBufferRef)pixelBufferToSampleBuffer:(CVPixelBufferRef)pixelBuffer {
CMSampleBufferRef sampleBuffer = NULL;
CMTime frameTime = CMTimeMakeWithSeconds([[NSDate date] timeIntervalSince1970], 1000000000);
// CMSampleTimingInfo fields are {duration, presentationTimeStamp, decodeTimeStamp}; leave the frame duration invalid
CMSampleTimingInfo timing = {kCMTimeInvalid, frameTime, kCMTimeInvalid};
CMVideoFormatDescriptionRef videoInfo = NULL;
CMVideoFormatDescriptionCreateForImageBuffer(NULL, pixelBuffer, &videoInfo);
OSStatus status = CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, true, NULL, NULL, videoInfo, &timing, &sampleBuffer);
if (status != noErr) {
NSLog(@"Failed to create sample buffer with error %d.", (int)status);
}
// Balance the +1 retain from CVPixelBufferCreate in convertNV12ToI420Scale:scaleSize:
CVPixelBufferRelease(pixelBuffer);
if (videoInfo) {
CFRelease(videoInfo);
}
return sampleBuffer;
}
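For reference, here is a minimal usage sketch chaining the two methods above from the AVCaptureVideoDataOutput delegate callback. It assumes the methods live on a hypothetical YUVTool class; everything other than the two method names is an assumption for illustration only.
#import <AVFoundation/AVFoundation.h>
// Inside the AVCaptureVideoDataOutputSampleBufferDelegate callback:
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    // 1. Scale the NV12 frame down to 720x1280; the returned pixel buffer is +1 retained
    CVPixelBufferRef scaledPixelBuffer = [YUVTool convertNV12ToI420Scale:sampleBuffer
                                                               scaleSize:CGSizeMake(720, 1280)];
    if (!scaledPixelBuffer) {
        return;
    }
    // 2. Wrap it back into a CMSampleBufferRef; pixelBufferToSampleBuffer: releases the pixel buffer
    CMSampleBufferRef scaledSampleBuffer = [YUVTool pixelBufferToSampleBuffer:scaledPixelBuffer];
    if (scaledSampleBuffer) {
        // ... hand the scaled frame to an encoder, preview layer, or asset writer here ...
        CFRelease(scaledSampleBuffer);
    }
}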