Notes
Before writing this up, I searched online for a lot of fixes and none of them worked.
Recording video with the local-recording feature works fine.
After switching to transcode-and-push streaming, frames appear to drop: my read is that some frames get sent out before the beauty-filter pass has been applied to them, which is what causes the flicker.
My guesses at the cause of the problem:
1. Local recording pulls the processed frame data through a different path than the live stream does.
@property (nonatomic, strong) GPUImageOutput<GPUImageInput> *filter;
@property (nonatomic, strong) GPUImageOutput<GPUImageInput> *output;
@property (nonatomic, strong) GPUImageAlphaBlendFilter *blendFilter;
@property (nonatomic, strong) GPUImageUIElement *uiElementInput;
-------------Add as a global property-------------------
@property (nonatomic, strong) GPUImagePixelBufferOutput *gpuOutput;
---------------------------------------
/// The video-capture target chain is wired up as follows:
/*
To understand how this works, you need to understand GPUImageUIElement and GPUImageAlphaBlendFilter.
GPUImageUIElement turns a view's layer into an image via CALayer's renderInContext: method,
then passes it to GPUImageAlphaBlendFilter as an OpenGL texture.
GPUImageAlphaBlendFilter is a two-input blend filter:
the first input is the camera feed,
the second input is the GPUImageUIElement output just mentioned.
GPUImageAlphaBlendFilter alpha-blends the two inputs, which you can simply think of as compositing the second input on top of the first.
*/
/*
Two filter chains composited in parallel:
filter -> blendFilter -> gpuImageView (display)
uiElementInput -> blendFilter -> gpuImageView (display)
uiElementInput: carries only the initial watermark image; if you comment out [self.filter addTarget:self.blendFilter], the screen shows only the watermark, with no live picture
filter: the live camera picture
*/
[self.filter addTarget:self.blendFilter];
[self.uiElementInput addTarget:self.blendFilter];
[self.blendFilter addTarget:self.gpuImageView];
-------------------------------------------------
| [self.blendFilter addTarget:self.gpuOutput]; |
-------------------------------------------------
if(self.saveLocalVideo) [self.blendFilter addTarget:self.movieWriter];
[self.filter addTarget:self.output];
-------------------------------------------------
| [self.filter addTarget:self.gpuOutput]; |
-------------------------------------------------
[self.uiElementInput update];
The boxed calls above are the ones you need to add; they fix the flicker and the disappearing watermark (a construction sketch of the objects involved follows below).
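For reference, here is a minimal construction sketch of those objects. It assumes a UIView named watermarkView built elsewhere (a hypothetical name); the rest is the stock GPUImage API:

// Construction sketch; `watermarkView` is a hypothetical UIView holding the watermark
self.blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
self.blendFilter.mix = 1.0; // overlay at full opacity
self.uiElementInput = [[GPUImageUIElement alloc] initWithView:watermarkView];
// self.filter is whatever beauty filter the session already uses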
/// The original way the beauty-filtered data was obtained
__weak typeof(self) weakSelf = self;
[self.output setFrameProcessingCompletionBlock:^(GPUImageOutput *output, CMTime time) {
    __strong __typeof(weakSelf)strongSelf = weakSelf;
    @autoreleasepool {
        // Grab the framebuffer at the end of the chain and pull out its pixel buffer
        GPUImageFramebuffer *imageFramebuffer = output.framebufferForOutput;
        CVPixelBufferRef pixelBuffer = [imageFramebuffer pixelBuffer];
        if (pixelBuffer && strongSelf.delegate && [strongSelf.delegate respondsToSelector:@selector(captureOutput:pixelBuffer:isBeauty:frameTime:)]) {
            [strongSelf.delegate captureOutput:strongSelf pixelBuffer:pixelBuffer isBeauty:strongSelf.beautyFace frameTime:time];
        }
    }
}];
With this original data path, the pushed stream flickered.
With the custom output class added, the processed data is obtained like this:
__weak typeof(self) weakSelf = self;
_gpuOutput.pixelBufferCallback = ^(CVPixelBufferRef _Nullable pixelBufferRef, CMTime frameTime) {
__strong __typeof(weakSelf)strongSelf = weakSelf;
if (pixelBufferRef && strongSelf.delegate && [strongSelf.delegate respondsToSelector:@selector(captureOutput:pixelBuffer:isBeauty:frameTime:)]) {
[strongSelf.delegate captureOutput:strongSelf pixelBuffer:pixelBufferRef isBeauty:strongSelf.beautyFace frameTime:frameTime];
}
};
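One piece the snippets above never show: the gpuOutput instance itself has to be created once before the targets are wired up. A minimal sketch, assuming the capture resolution is available as a CGSize named videoSize (a placeholder):

// Create the custom output once, sized to the capture resolution
self.gpuOutput = [[GPUImagePixelBufferOutput alloc] initWithImageSize:videoSize];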
2. You can download the GPUImage source yourself to study how the recording path obtains its data; a snippet is pasted here for the record.
/// How GPUImageMovieWriter receives frames
@interface GPUImageMovieWriter : NSObject <GPUImageInput>

#pragma mark GPUImageInput protocol
- (void)newFrameReadyAtTime:(CMTime)frameTime atIndex:(NSInteger)textureIndex
{
......
/// method body elided; see the GPUImage source
}
3. We define a custom GPUImagePixelBufferOutput to manage our beauty-filtered data, imitating how GPUImageMovieWriter processes frames for its output.
The class is declared as @interface GPUImagePixelBufferOutput : GPUImageRawDataOutput <GPUImageInput>; inheriting from GPUImageRawDataOutput makes the processed data much easier to work with.
Source files:
GPUImagePixelBufferOutput.h
#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>
#if __has_include(<GPUImage/GPUImageFramework.h>)
#import <GPUImage/GPUImageRawDataOutput.h>
#else
#import "GPUImageRawDataOutput.h"
#endif
#import <CoreMedia/CoreMedia.h>   // CMTime
#import <CoreVideo/CoreVideo.h>   // CVPixelBufferRef

typedef void (^GPUImageBufferOutputBlock)(CVPixelBufferRef _Nullable pixelBufferRef, CMTime frameTime);
NS_ASSUME_NONNULL_BEGIN
@interface GPUImagePixelBufferOutput : GPUImageRawDataOutput <GPUImageInput>
@property (nonatomic, copy) GPUImageBufferOutputBlock pixelBufferCallback;
- (instancetype)initWithImageSize:(CGSize)newImageSize;
@end
NS_ASSUME_NONNULL_END
GPUImagePixelBufferOutput.m
My project has some custom requirements here, so the capture-time timestamp from the camera is preserved.
We override the GPUImageInput protocol method
- (void)newFrameReadyAtTime:(CMTime)frameTime atIndex:(NSInteger)textureIndex;
to handle the data ourselves. Because the class inherits from GPUImageRawDataOutput, which manages the data output at the end of the processing chain, it receives every frame after the entire video stream has been processed; overriding this method is where you implement your own business logic.
#import "GPUImagePixelBufferOutput.h"
@implementation GPUImagePixelBufferOutput
- (instancetype)initWithImageSize:(CGSize)newImageSize {
    // BGRA output matches what the filter chain renders, so no extra conversion is needed
    self = [super initWithImageSize:newImageSize resultsInBGRAFormat:YES];
    return self;
}
#pragma mark - GPUImageInput protocol
- (void)newFrameReadyAtTime:(CMTime)frameTime atIndex:(NSInteger)textureIndex
{
[super newFrameReadyAtTime:frameTime atIndex:textureIndex];
// Float64 time = CMTimeGetSeconds(frameTime);
// NSLog(@"capture time: %f", time); // sample output: 1917.957380
[self lockFramebufferForReading];
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                         [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                         [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                         nil];
// NSDictionary *options = @{(id)kCVPixelBufferIOSurfacePropertiesKey : @{}};
CVPixelBufferRef pixelBuffer = NULL;
// Wrap the chain's raw BGRA bytes in a CVPixelBuffer without copying them
CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                             imageSize.width,
                             imageSize.height,
                             kCVPixelFormatType_32BGRA,
                             self.rawBytesForImage,
                             self.bytesPerRowInOutput,
                             NULL,
                             NULL,
                             (__bridge CFDictionaryRef)options,
                             &pixelBuffer);
if (self.pixelBufferCallback) {
    // The buffer borrows the framebuffer's memory, so callers must consume it synchronously
    self.pixelBufferCallback(pixelBuffer, frameTime);
}
CVPixelBufferRelease(pixelBuffer);
[self unlockFramebufferAfterReading];
}
// No-op stubs mirroring GPUImageMovieWriter's interface, per the "mimic the writer" approach above
- (void)willOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer {}
- (void)processAudioBuffer:(CMSampleBufferRef)audioBuffer {}
- (BOOL)hasAudioTrack { return YES; }
@end
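Finally, a hedged sketch of the consuming side. The delegate method signature is the one used in the callbacks above; wrapping the pixel buffer in a CMSampleBuffer via plain CoreMedia calls is just one way to hand it to an encoder, not something this post prescribes. Remember the buffer only borrows the framebuffer's memory, so it must be consumed before the callback returns:

- (void)captureOutput:(id)capture pixelBuffer:(CVPixelBufferRef)pixelBuffer isBeauty:(BOOL)isBeauty frameTime:(CMTime)frameTime {
    if (!pixelBuffer) return;
    // Describe the BGRA buffer so CoreMedia can wrap it
    CMVideoFormatDescriptionRef formatDesc = NULL;
    CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, &formatDesc);
    // Reuse the capture timestamp that GPUImagePixelBufferOutput preserved
    CMSampleTimingInfo timing = { kCMTimeInvalid, frameTime, kCMTimeInvalid };
    CMSampleBufferRef sampleBuffer = NULL;
    CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, true, NULL, NULL,
                                       formatDesc, &timing, &sampleBuffer);
    if (sampleBuffer) {
        // ... hand sampleBuffer to the encoder/pusher synchronously here ...
        CFRelease(sampleBuffer);
    }
    if (formatDesc) CFRelease(formatDesc);
}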