Since I've been working on live streaming lately, I suddenly had the idea of trying to build my own filters. I experimented with quite a few approaches; below I'll share what I pieced together from the web combined with my own ideas.
First, when you think of filters on iOS, the famous open-source library GPUImage comes to mind. Anyone familiar with GPUImage knows you can get filters by setting up a GPUImageVideoCamera as the capture source and adding filter targets. But our live streaming is built on Qiniu (七牛), and after going through the Qiniu docs I found no way to plug in a custom camera. So the only option left was to process the CVPixelBufferRef directly.
Here's the end result first, using Core Image:
CVPixelBufferRef->CIImage->CIFilter->CIImage->CVPixelBufferRef
_coreImageContext = [CIContext contextWithEAGLContext:self.openGLESContext options:options];
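The line above references self.openGLESContext, an options dictionary, and a _coreImageFilter that aren't shown. Here is a minimal sketch of that setup; the property/ivar names and the concrete filter (CIPhotoEffectMono) are my own illustrative assumptions, not something fixed by the code above.

#import <CoreImage/CoreImage.h>
#import <OpenGLES/EAGL.h>

// Minimal setup sketch (names and filter choice are assumptions).
- (void)setupCoreImagePipeline
{
    self.openGLESContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];

    // Disable Core Image color management to save a conversion pass per frame.
    NSDictionary *options = @{ kCIContextWorkingColorSpace : [NSNull null] };
    _coreImageContext = [CIContext contextWithEAGLContext:self.openGLESContext
                                                  options:options];

    // Any CIFilter can be swapped in; a grayscale photo effect is a cheap test case.
    _coreImageFilter = [CIFilter filterWithName:@"CIPhotoEffectMono"];
}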
- (CVPixelBufferRef)coreImageHandle:(CVPixelBufferRef)pixelBuffer
{
    CFAbsoluteTime elapsedTime, startTime = CFAbsoluteTimeGetCurrent();
    // Wrap the incoming buffer, run it through the filter, and render the
    // result straight back into the same buffer (no extra copy).
    CIImage *inputImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    [_coreImageFilter setValue:inputImage forKey:kCIInputImageKey];
    CIImage *outputImage = [_coreImageFilter outputImage];
    elapsedTime = CFAbsoluteTimeGetCurrent() - startTime;
    NSLog(@"Core Image frame time: %f", elapsedTime * 1000.0);
    [_coreImageContext render:outputImage toCVPixelBuffer:pixelBuffer];
    return pixelBuffer;
}
Here _coreImageContext does its work on the GPU through OpenGL ES; it could be switched to a CPU-based context, but for performance the GPU is the way to go. With this setup each frame takes roughly 1 ms to process, which is easily enough to keep up with 24 fps.
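For completeness, here is roughly how coreImageHandle: plugs into the streaming side. I'm not reproducing the exact Qiniu delegate method; the callback name below is a placeholder for whichever method your version of the Qiniu SDK uses to hand you each camera frame before it is encoded.

// Hypothetical hookup: didGetCameraPixelBuffer: is a stand-in for the real
// Qiniu callback that delivers each frame before encoding.
- (CVPixelBufferRef)didGetCameraPixelBuffer:(CVPixelBufferRef)pixelBuffer
{
    // Filter in place and return the same buffer to the SDK.
    return [self coreImageHandle:pixelBuffer];
}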
First approach:
Extract a texture from the CVPixelBufferRef, hand it to a GPUImage filter, then turn the processed texture back into a CVPixelBufferRef and pass that to Qiniu. My OpenGL fundamentals weren't up to it, so I haven't gotten this working and have shelved it for now.
I tried to imitate what GPUImageVideoCamera does inside
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
to turn the pixel buffer into a texture. Note, though, that the buffer handed back by Qiniu's delegate has pixelFormat = BGRA, so when extracting the texture there is only a single, non-planar plane of data (CVPixelBufferGetPlaneCount(cameraFrame) does not report the separate Y/UV planes that GPUImage's own camera path normally handles). (To be continued.)
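Here is a minimal sketch of that texture-extraction step for a BGRA buffer, modeled on GPUImageVideoCamera's BGRA path. The _textureCache and _glContext ivars are assumptions, and this only gets you as far as an OpenGL ES texture; feeding it into a GPUImage filter chain and reading the result back out is the part I haven't cracked yet.

#import <CoreVideo/CoreVideo.h>
#import <OpenGLES/ES2/gl.h>
#import <OpenGLES/ES2/glext.h>

// One-time setup: a texture cache bound to the EAGLContext used for filtering
// (_glContext and _textureCache are assumed ivars).
CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, _glContext, NULL, &_textureCache);

// Per frame: wrap the BGRA pixel buffer in an OpenGL ES texture without copying.
CVOpenGLESTextureRef texture = NULL;
CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                            _textureCache,
                                                            pixelBuffer,
                                                            NULL,
                                                            GL_TEXTURE_2D,
                                                            GL_RGBA,
                                                            (GLsizei)CVPixelBufferGetWidth(pixelBuffer),
                                                            (GLsizei)CVPixelBufferGetHeight(pixelBuffer),
                                                            GL_BGRA,          // matches the buffer's BGRA layout
                                                            GL_UNSIGNED_BYTE,
                                                            0,                // plane 0: the buffer is non-planar
                                                            &texture);
if (err == kCVReturnSuccess) {
    glBindTexture(CVOpenGLESTextureGetTarget(texture), CVOpenGLESTextureGetName(texture));
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    // ... hand the texture to the filter chain here ...
    CFRelease(texture);
}
CVOpenGLESTextureCacheFlush(_textureCache, 0);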
Second approach:
Process the RGBA bytes directly on the CPU. The performance cost is far too high; abandoned.
I changed the method to a simple grayscale conversion and CPU usage still sat at 50-80%. The phone got damn hot.
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
CFAbsoluteTime elapsedTime, startTime = CFAbsoluteTimeGetCurrent();
unsigned char *data = (unsigned char *)CVPixelBufferGetBaseAddress(pixelBuffer);
size_t bufferHeight = CVPixelBufferGetHeight(pixelBuffer);
size_t bufferWidth = CVPixelBufferGetWidth(pixelBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer); // rows may be padded, so don't assume width * 4
for (size_t row = 0; row < bufferHeight; row++)
{
    unsigned char *pixel = data + row * bytesPerRow;
    for (size_t col = 0; col < bufferWidth; col++, pixel += 4)
    {
        // The buffer is BGRA, so the byte order is B, G, R, A.
        UInt8 b_pixel = pixel[0];
        UInt8 g_pixel = pixel[1];
        UInt8 r_pixel = pixel[2];
        // Gray = R*0.299 + G*0.587 + B*0.114; the weights sum to 1.0,
        // so the result can never exceed 255 and needs no clamping.
        UInt8 gray = (UInt8)(r_pixel * 0.299 + g_pixel * 0.587 + b_pixel * 0.114);
        pixel[0] = gray;
        pixel[1] = gray;
        pixel[2] = gray;
    }
}
elapsedTime = CFAbsoluteTimeGetCurrent() - startTime;
NSLog(@"CPU frame time: %f", elapsedTime * 1000.0);
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
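The numbers explain the heat: assuming a 720p stream, that loop touches 1280 × 720 ≈ 920,000 pixels per frame, and at 24 fps that is roughly 22 million pixels (close to 90 MB of reads and writes) per second, each pixel with several floating-point multiplies, all on the CPU. No wonder even a trivial grayscale pass sat at 50-80% CPU, while the Core Image version above pushes the same work onto the GPU.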
Third approach:
That's just the Core Image result shown at the top~~