Preface
A brief introduction to using GPUImage and the problems encountered along the way. GPUImage can be downloaded from https://github.com/BradLarson/GPUImage.
Importing GPUImage
GPUImage can be brought in three ways: via CocoaPods, by importing the project directly, or by building a static library (.a) and linking it.
- Via CocoaPods
  Just add GPUImage to your Podfile; this is the simplest and most direct option.
- Importing the GPUImage project
  This approach is more tedious and needs extra project configuration; see 如何正確的導入項目 for the details.
- Static library
  Build the downloaded GPUImage.xcodeproj to produce the .a file. Note that simulator and device builds are separate, and Debug and Release must be distinguished as well; when generating the .a, select Generic iOS Device to produce a static library containing multiple instruction sets.
Modifying the GPUImage source
GPUImageView reads its own size inside an asynchronous block, but that read must happen on the main thread, so the first launch stalls for a moment; this needs to be worked around.
Declare a viewBounds variable and assign it in initWithFrame:, then have recalculateViewGeometry use the cached value instead of querying the view:
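A minimal sketch of that change inside GPUImageView.m, assuming a class-extension property named viewBounds (the exact init body depends on your GPUImage version; the point is caching the bounds while still on the main thread):

// Sketch: cache the bounds so the video-processing queue never reads UIKit geometry.
@interface GPUImageView ()
@property (nonatomic, assign) CGRect viewBounds;
@end

- (id)initWithFrame:(CGRect)frame
{
    if (!(self = [super initWithFrame:frame]))
    {
        return nil;
    }
    _viewBounds = self.bounds; // main thread: safe to read bounds here
    [self commonInit];         // existing GPUImageView setup
    return self;
}

If the view can be resized after creation, layoutSubviews would also need to refresh viewBounds.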
- (void)recalculateViewGeometry;
{
    runSynchronouslyOnVideoProcessingQueue(^{
        CGFloat heightScaling, widthScaling;

        CGSize currentViewSize = self.viewBounds.size;
        // CGFloat imageAspectRatio = inputImageSize.width / inputImageSize.height;
        // CGFloat viewAspectRatio = currentViewSize.width / currentViewSize.height;
        CGRect insetRect = AVMakeRectWithAspectRatioInsideRect(inputImageSize, self.viewBounds);

        switch (_fillMode)
        {
            case kGPUImageFillModeStretch:
            {
                widthScaling = 1.0;
                heightScaling = 1.0;
            }; break;
            case kGPUImageFillModePreserveAspectRatio:
            {
                widthScaling = insetRect.size.width / currentViewSize.width;
                heightScaling = insetRect.size.height / currentViewSize.height;
            }; break;
            case kGPUImageFillModePreserveAspectRatioAndFill:
            {
                // CGFloat widthHolder = insetRect.size.width / currentViewSize.width;
                widthScaling = currentViewSize.height / insetRect.size.height;
                heightScaling = currentViewSize.width / insetRect.size.width;
            }; break;
        }

        imageVertices[0] = -widthScaling;
        imageVertices[1] = -heightScaling;
        imageVertices[2] = widthScaling;
        imageVertices[3] = -heightScaling;
        imageVertices[4] = -widthScaling;
        imageVertices[5] = heightScaling;
        imageVertices[6] = widthScaling;
        imageVertices[7] = heightScaling;
    });
}
If the first frame after recording is black, first make sure -fobjc-arc and -ObjC are both added to Other Linker Flags in Build Settings; if it still goes black, see http://www.reibang.com/p/c218651cc461.
Starting the camera
For capture you can choose between the front and back cameras and configure mirroring. Setting the GPUImageView's fillMode to kGPUImageFillModePreserveAspectRatioAndFill gives a full-screen preview. The iPhone X is not 16:9, so you can apply an iPhone X-specific layout (see the sketch after the code below).
self.videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset1280x720 cameraPosition:AVCaptureDevicePositionBack];
// To set hardware properties such as focusMode and exposureMode on an AVCaptureDevice,
// the client must first acquire a lock on the device.
if ([_videoCamera.inputCamera lockForConfiguration:nil]) {
    if ([_videoCamera.inputCamera isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]) { // continuous autofocus
        [_videoCamera.inputCamera setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
    }
    if ([_videoCamera.inputCamera isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]) { // continuous auto exposure
        [_videoCamera.inputCamera setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
    }
    if ([_videoCamera.inputCamera isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance]) { // continuous auto white balance
        [_videoCamera.inputCamera setWhiteBalanceMode:AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance];
    }
    [_videoCamera.inputCamera unlockForConfiguration]; // unlock the device to commit the configuration
}
_videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait; // capture in portrait orientation
[_videoCamera addAudioInputsAndOutputs]; // Adding the audio input/output later briefly stalls recording,
                                         // so call this up front if you intend to record sound.
_videoCamera.horizontallyMirrorFrontFacingCamera = YES; // mirror the front camera
_videoCamera.horizontallyMirrorRearFacingCamera = NO;   // do not mirror the back camera
_videoCamera.frameRate = 30;
[_videoCamera addTarget:self.beautyFilter];

self.filterView = [[GPUImageView alloc] initWithFrame:self.view.bounds];
_filterView.fillMode = kGPUImageFillModePreserveAspectRatioAndFill;
[self.beautyFilter addTarget:_filterView];
[self.view addSubview:_filterView];

[_videoCamera startCameraCapture]; // start the camera
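Regarding the iPhone X note above: one option (an illustration on my part, not necessarily what the original project did) is to letterbox the preview to a 16:9 rect on screens that are taller than 16:9:

// Sketch: give notched devices such as the iPhone X a vertically centred
// 16:9 preview instead of filling the whole screen.
CGSize screenSize = [UIScreen mainScreen].bounds.size;
CGRect previewFrame = self.view.bounds;
if (screenSize.height / screenSize.width > 16.0 / 9.0 + 0.01) {
    CGFloat previewHeight = screenSize.width * 16.0 / 9.0;
    previewFrame = CGRectMake(0.0,
                              (screenSize.height - previewHeight) / 2.0,
                              screenSize.width,
                              previewHeight);
}
self.filterView = [[GPUImageView alloc] initWithFrame:previewFrame];
_filterView.fillMode = kGPUImageFillModePreserveAspectRatioAndFill;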
The capture code above applies a beauty filter by default; for the implementation see http://www.reibang.com/p/6bdb4cb50f14.
- (GPUImageBeautifyFilter *)beautyFilter {
    if (!_beautyFilter) {
        _beautyFilter = [[GPUImageBeautifyFilter alloc] init];
    }
    return _beautyFilter;
}
Switching filters
Before switching filters, remove the existing targets and then rebuild the chain (imageFilter below is the newly selected filter):
[self.videoCamera removeAllTargets];
[self.filterGroup removeAllTargets];
[self.beautyFilter removeAllTargets];
if (filterType == CameraVideoFilterNone) { // no extra filter: camera -> beauty -> preview
    [self.videoCamera addTarget:self.beautyFilter];
    [self.beautyFilter addTarget:_filterView];
    return;
}
[self.filterGroup setInitialFilters:@[imageFilter]];
[self.filterGroup setTerminalFilter:imageFilter];
[self.videoCamera addTarget:self.beautyFilter];
[self.beautyFilter addTarget:self.filterGroup];
[self.filterGroup addTarget:_filterView];
- (GPUImageFilterGroup *)filterGroup {
    if (!_filterGroup) {
        _filterGroup = [[GPUImageFilterGroup alloc] init];
    }
    return _filterGroup;
}
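The snippet above never shows how imageFilter is created; a minimal sketch, assuming a CameraVideoFilterType enum like the one used above and a few stock GPUImage filters (the mapping from enum case to filter class is purely illustrative):

// Sketch: build the filter that matches the user's selection.
GPUImageOutput<GPUImageInput> *imageFilter = nil;
switch (filterType) {
    case CameraVideoFilterSepia:      // hypothetical enum case
        imageFilter = [[GPUImageSepiaFilter alloc] init];
        break;
    case CameraVideoFilterSketch:     // hypothetical enum case
        imageFilter = [[GPUImageSketchFilter alloc] init];
        break;
    case CameraVideoFilterGrayscale:  // hypothetical enum case
        imageFilter = [[GPUImageGrayscaleFilter alloc] init];
        break;
    default:
        break;
}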
Recording video
Once a filter has been selected, recording can start. When recording finishes, the writer target has to be removed, and the MovieWriter must be re-initialised before recording is started again.
- (void)videoStartRecording {
    NSString *pathToMovie = [NSString stringWithFormat:@"%@/camera_video.mp4",
                             [NSHomeDirectory() stringByAppendingPathComponent:@"tmp"]];
    unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
    NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];

    self.movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(720.0, 1280.0)];
    _movieWriter.encodingLiveVideo = YES;
    [_movieWriter setCompletionBlock:^{
    }];
    [_movieWriter setFailureBlock:^(NSError *error) {
    }];

    if (self.selectdFilterType == CameraVideoFilterNone) {
        [self.beautyFilter addTarget:_movieWriter];
    } else {
        [self.filterGroup addTarget:_movieWriter];
    }
    _videoCamera.audioEncodingTarget = _movieWriter;
    [_movieWriter startRecording];
}

- (void)videoFinishRecoring {
    if (self.selectdFilterType == CameraVideoFilterNone) {
        [self.beautyFilter removeTarget:_movieWriter];
    } else {
        [self.filterGroup removeTarget:_movieWriter];
    }
    _videoCamera.audioEncodingTarget = nil;
    [_movieWriter finishRecording];
}
Adding filters to a local video
For applying filters to a local video you can follow the demo in GPUImage-Master; the thing to watch out for here is video rotation.
Rotation has to be handled both when displaying the local video and when writing the filtered output.
Displaying a local video with a filter applied:
// Rotation angle of the video track
- (NSUInteger)degressFromVideoFileWithURL:(NSURL *)url {
    NSUInteger degress = 0;
    AVAsset *asset = [AVAsset assetWithURL:url];
    NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
    if ([tracks count] > 0) {
        AVAssetTrack *videoTrack = [tracks objectAtIndex:0];
        CGAffineTransform t = videoTrack.preferredTransform;
        if (t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0) {
            // Portrait
            degress = 90;
        } else if (t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0) {
            // PortraitUpsideDown
            degress = 270;
        } else if (t.a == 1.0 && t.b == 0 && t.c == 0 && t.d == 1.0) {
            // LandscapeRight
            degress = 0;
        } else if (t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0) {
            // LandscapeLeft
            degress = 180;
        }
    }
    return degress;
}
- (void)setupVideo {
    NSURL *videoPathUrl = [NSURL fileURLWithPath:self.videoPath]; // path to the local video
    self.movieFile = [[GPUImageMovie alloc] initWithURL:videoPathUrl];
    _movieFile.runBenchmark = YES;
    _movieFile.playAtActualSpeed = YES;
    _movieFile.shouldRepeat = YES;
    [_movieFile addTarget:self.beautyFilter]; // attach the filter

    self.filterView = [[GPUImageView alloc] initWithFrame:self.view.bounds];
    _filterView.fillMode = kGPUImageFillModePreserveAspectRatio;
    [self adjustVideoDegressWithUrl:videoPathUrl]; // compensate for the track's rotation
    [self.beautyFilter addTarget:_filterView];
    [self.view addSubview:_filterView];

    [_movieFile startProcessing];
}
// Compensate for the video's rotation when displaying it
- (void)adjustVideoDegressWithUrl:(NSURL *)url {
    NSInteger degress = [self degressFromVideoFileWithURL:url];
    switch (degress) {
        case 90:
            [_filterView setInputRotation:kGPUImageRotateRight atIndex:0];
            self.videoDegress = 90;
            break;
        case 180:
            [_filterView setInputRotation:kGPUImageRotate180 atIndex:0];
            self.videoDegress = 180;
            break;
        case 270:
            [_filterView setInputRotation:kGPUImageRotateLeft atIndex:0];
            self.videoDegress = 270;
            break;
        default:
            break;
    }
}
With this the local video displays correctly; switching filters and saving work much like the capture case. Note that when saving the filtered video you also need to specify the rotation angle:
[kmovieFile enableSynchronizedEncodingUsingMovieWriter:kmovieWriter];
[kmovieWriter startRecordingInOrientation:CGAffineTransformMakeRotation(self.videoDegress/180.0*M_PI)];
[kmovieFile startProcessing];
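For context, a minimal sketch of the export chain those three lines belong to, assuming kmovieFile, kmovieWriter and an outputURL set up along the lines of the earlier code (the completion handling follows GPUImage's SimpleVideoFileFilter example, but the surrounding names are illustrative):

// Sketch: local video -> filter -> movie writer, preserving the original rotation.
GPUImageMovie *kmovieFile = [[GPUImageMovie alloc] initWithURL:videoPathUrl];
GPUImageMovieWriter *kmovieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:outputURL
                                                                             size:CGSizeMake(720.0, 1280.0)];

[kmovieFile addTarget:self.beautyFilter];
[self.beautyFilter addTarget:kmovieWriter];

__weak GPUImageMovieWriter *weakWriter = kmovieWriter;
[kmovieWriter setCompletionBlock:^{
    [self.beautyFilter removeTarget:weakWriter];
    [weakWriter finishRecording]; // the filtered file is now at outputURL
}];

[kmovieFile enableSynchronizedEncodingUsingMovieWriter:kmovieWriter];
[kmovieWriter startRecordingInOrientation:CGAffineTransformMakeRotation(self.videoDegress / 180.0 * M_PI)];
[kmovieFile startProcessing];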
Video compression
Video can be compressed by reducing the resolution and/or the bitrate. For resolution you can simply specify the target size.
For bitrate compression we can use https://github.com/rs/SDAVAssetExportSession.
For a detailed write-up see wheelsMaker's iOS視頻壓縮筆記.
When using it, watch out for problems caused by the video's rotation:
SDAVAssetExportSession *encoder = [[SDAVAssetExportSession alloc] initWithAsset:avAsset];
encoder.outputFileType = AVFileTypeMPEG4;
encoder.outputURL = [NSURL fileURLWithPath:zipPath];

CGFloat videoWidth = 720.0;
CGFloat videoHeight = 1280.0;
NSArray *tracks = [avAsset tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *videoTrack;
if ([tracks count] > 0) {
    videoTrack = [tracks objectAtIndex:0];
    videoWidth = videoTrack.naturalSize.width;
    videoHeight = videoTrack.naturalSize.height;
}
// For rotated videos, naturalSize is reported pre-rotation, so swap width and height
if (videoTrack && [NBCameraVideoTools degressFromVideoFileWithURL:[NSURL fileURLWithPath:originalPath]] % 180 != 0) {
    videoWidth = videoTrack.naturalSize.height;
    videoHeight = videoTrack.naturalSize.width;
}

encoder.videoSettings = @{
    AVVideoCodecKey: AVVideoCodecH264,
    AVVideoWidthKey: @(videoWidth),
    AVVideoHeightKey: @(videoHeight),
    AVVideoCompressionPropertiesKey: @{
        AVVideoAverageBitRateKey: @1000000,
        AVVideoProfileLevelKey: AVVideoProfileLevelH264High40
    }
};
encoder.audioSettings = @{
    AVFormatIDKey: @(kAudioFormatMPEG4AAC),
    AVNumberOfChannelsKey: @2,
    AVSampleRateKey: @44100,
    AVEncoderBitRateKey: @128000
};

[encoder exportAsynchronouslyWithCompletionHandler:^{
    if (encoder.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"Video export succeeded");
    } else if (encoder.status == AVAssetExportSessionStatusCancelled) {
        NSLog(@"Video export cancelled");
    } else {
        NSLog(@"Video export failed with error: %@ (%ld)", encoder.error.localizedDescription, (long)encoder.error.code);
    }
    dispatch_async(dispatch_get_main_queue(), ^{
        showTipsInCenter([NSString stringWithFormat:@"Export finished, status: %ld", (long)[encoder status]]);
    });
}];
Other
Video watermarking and composition are not covered here; there are plenty of working approaches online. Before using GPUImage it is worth running its demo first to see how its filters are implemented and used.
Issues encountered later
1. Wild-pointer crash in [GPUImageMovie endProcessing] (fairly low probability)
It happens because the movie has already been set to nil and the writer is set to nil afterwards, on different threads, so a write can still be in flight when the objects go away.
For a solution see 數(shù)數(shù)GPUImage里那些未知的坑; one possible mitigation is sketched below.
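A minimal sketch of that mitigation, assuming processing is being abandoned anyway (this is my reading, not necessarily the exact fix from the article): serialise the teardown on the video-processing queue so no frame write can race with the properties being set to nil.

// Sketch: stop and release everything in one block on the video-processing queue.
runSynchronouslyOnVideoProcessingQueue(^{
    [self.movieFile removeAllTargets];
    [self.movieFile cancelProcessing];
    self.movieFile = nil;
    self.movieWriter = nil;
});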
2. SEGV_ACCERR crash in [GPUImageContext presentBufferForDisplay]
This one shows up after locking the screen and coming back.
The reason is that iOS does not allow OpenGL rendering once the app resigns active (appWillResignActive), so before going to the background you have to call glFinish(), which submits the commands in the buffer (full or not) to the graphics hardware immediately and waits for the hardware to finish executing them before returning.
Note that on iOS 11 this callback may fire twice.
The fix is:
runSynchronouslyOnVideoProcessingQueue(^{
    glFinish();
});
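A minimal sketch of where this could be hooked up (the notification-based wiring is illustrative, not from the original article):

// e.g. registered in viewDidLoad:
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(appWillResignActive)
                                             name:UIApplicationWillResignActiveNotification
                                           object:nil];

// Called as the app resigns active: flush every queued GL command so nothing
// tries to render while the app is in the background.
- (void)appWillResignActive {
    runSynchronouslyOnVideoProcessingQueue(^{
        glFinish();
    });
}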
See: GPUImage presentBufferForDisplay崩潰問題
References:
- iOS開發(fā)-美顏相機、短視頻(GPUImage的使用)
- https://github.com/rs/SDAVAssetExportSession
- iOS視頻壓縮筆記
- 如何正確的導入項目
- iOS設備閃光燈的使用
- 源碼級別對GPUImage進行剖析 以及 嘗試
- 數(shù)數(shù)GPUImage里那些未知的坑
- GPUImage presentBufferForDisplay崩潰問題