ReplayKit2 Screen Recording
If you need to record the screen of an iPhone, ReplayKit is something you must know. This post covers tips and usage of ReplayKit2 on iOS 12 and later. Why not cover recording before iOS 12? Because it was painfully awkward: awkward enough that even we developers found it clumsy to use, never mind end users, and Apple is already on iOS 14 anyway...
Tips for Using ReplayKit2
- The system only gives us a view of type `RPSystemBroadcastPickerView`, and the recording sheet appears only when the user taps that view. So how do we hide it gracefully? The answer is to cover it with a view of our own and forward the tap event to it. The events to send differ slightly between system versions; the code is as follows:

```objc
@property (nonatomic, strong) RPSystemBroadcastPickerView *sysTemBroadCastPickerView; // the system picker view
@property (nonatomic, strong) UIButton *startPushStreamBtn;                           // our own "start recording" button

- (void)showReplayKitView {
    if (@available(iOS 12.0, *)) {
        for (UIView *view in _sysTemBroadCastPickerView.subviews) {
            if ([view isKindOfClass:[UIButton class]]) {
                float iOSVersion = [[UIDevice currentDevice].systemVersion floatValue];
                UIButton *button = (UIButton *)view;
                if (button != self.startPushStreamBtn) {
                    if (iOSVersion >= 13) {
                        [(UIButton *)view sendActionsForControlEvents:UIControlEventTouchDown];
                        [(UIButton *)view sendActionsForControlEvents:UIControlEventTouchUpInside];
                    } else {
                        [(UIButton *)view sendActionsForControlEvents:UIControlEventTouchDown];
                    }
                }
            }
        }
    }
}
```
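For completeness, here is a minimal sketch of how the overlay setup might look; the method name `setupRecordButton`, the choice of adding the picker as a subview of the custom button, and the alpha trick are assumptions for illustration, not part of the original code:

```objc
// Minimal sketch (assumed setup, adapt as needed): keep the system picker in the hierarchy but
// invisible, and let our own styled button trigger it programmatically via -showReplayKitView.
- (void)setupRecordButton API_AVAILABLE(ios(12.0)) {
    self.sysTemBroadCastPickerView.frame = self.startPushStreamBtn.bounds;
    self.sysTemBroadCastPickerView.alpha = 0;                     // invisible; we never rely on direct taps
    [self.startPushStreamBtn addSubview:self.sysTemBroadCastPickerView];
    [self.startPushStreamBtn addTarget:self
                                action:@selector(showReplayKitView)
                      forControlEvents:UIControlEventTouchUpInside];
}
```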
- How to specify which app gets recorded and whether the microphone is offered. Note that the `BundleID` here is the bundle ID of the recording extension target, not the main app's bundle ID, so keep the two apart. If you leave it empty, the picker sheet lists every broadcast-capable app installed on the phone, which is not exactly user-friendly:

```objc
- (RPSystemBroadcastPickerView *)sysTemBroadCastPickerView API_AVAILABLE(ios(12.0)) {
    if (!_sysTemBroadCastPickerView) {
        _sysTemBroadCastPickerView = [[RPSystemBroadcastPickerView alloc] init];
        _sysTemBroadCastPickerView.showsMicrophoneButton = NO;                          // whether to show the microphone toggle
        _sysTemBroadCastPickerView.preferredExtension = [MnaConfig replayKitBundleID];  // bundle ID of the broadcast extension to record with
    }
    return _sysTemBroadCastPickerView;
}
```
- After tapping "Start Broadcast" there is a countdown; once it finishes, how do we dismiss the recording sheet gracefully? First we need to capture the system picker controller. From the way it animates it looks like a `presentViewController` call, and trying Method Swizzling confirms we can grab it. The recording extension can then post a cross-process notification once it has started, and the main process dismisses the sheet when it receives it. The code:

```objc
@implementation UIViewController (MnaPresentSwizzleAdd)

+ (void)load {
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        [self swizzleSelector:@selector(presentViewController:animated:completion:)
          withAnotherSelector:@selector(mna_presentViewController:animated:completion:)];
    });
}

+ (void)swizzleSelector:(SEL)originalSelector withAnotherSelector:(SEL)swizzledSelector {
    Class aClass = [self class];
    Method originalMethod = class_getInstanceMethod(aClass, originalSelector);
    Method swizzledMethod = class_getInstanceMethod(aClass, swizzledSelector);
    BOOL didAddMethod = class_addMethod(aClass, originalSelector,
                                        method_getImplementation(swizzledMethod),
                                        method_getTypeEncoding(swizzledMethod));
    if (didAddMethod) {
        class_replaceMethod(aClass, swizzledSelector,
                            method_getImplementation(originalMethod),
                            method_getTypeEncoding(originalMethod));
    } else {
        method_exchangeImplementations(originalMethod, swizzledMethod);
    }
}

#pragma mark - Method Swizzling

- (void)mna_presentViewController:(UIViewController *)viewControllerToPresent animated:(BOOL)flag completion:(void (^)(void))completion {
    if ([NSStringFromClass(viewControllerToPresent.class) isEqualToString:@"RPBroadcastPickerStandaloneViewController"]) {
        // The manager listens for the "extension started" notification and then dismisses this controller.
        MnaReplayKitHiddenManager.sharedInstance.replayKitBraodViewControler = viewControllerToPresent;
        [self mna_presentViewController:viewControllerToPresent animated:flag completion:completion];
    } else {
        [self mna_presentViewController:viewControllerToPresent animated:flag completion:completion];
    }
}

@end
```
Update: on iOS 14 the picker is presented differently; for now I have not found how it is shown, so this approach only works below iOS 14.
About the cross-process notification: inter-process communication is a hassle. You can use `CFNotificationCenterPostNotification` to pass messages, but it cannot carry any data. You can, however, combine it with App Groups, which do allow sharing data between the processes. A ready-made open-source wrapper worth recommending is MMWormhole.
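For reference, a minimal sketch of the Darwin-notification approach (the notification name `mna.replaykit.broadcastStarted` and the callback are assumptions for illustration): the extension posts the notification once the broadcast has started, and the host app observes it, at which point it can dismiss the captured picker controller or read shared data from the App Group container.

```objc
#import <CoreFoundation/CoreFoundation.h>

// In the broadcast extension: post a cross-process (Darwin) notification. Note: no userInfo is delivered.
static void MnaPostBroadcastStartedNotification(void) {
    CFNotificationCenterPostNotification(CFNotificationCenterGetDarwinNotifyCenter(),
                                         CFSTR("mna.replaykit.broadcastStarted"),  // assumed name
                                         NULL, NULL, true);
}

// In the host app: callback invoked when the notification arrives.
static void MnaBroadcastStartedCallback(CFNotificationCenterRef center, void *observer,
                                        CFNotificationName name, const void *object,
                                        CFDictionaryRef userInfo) {
    dispatch_async(dispatch_get_main_queue(), ^{
        // e.g. dismiss the captured RPBroadcastPickerStandaloneViewController here.
    });
}

// In the host app: register the observer once, e.g. at startup.
static void MnaObserveBroadcastStarted(void) {
    CFNotificationCenterAddObserver(CFNotificationCenterGetDarwinNotifyCenter(),
                                    NULL, MnaBroadcastStartedCallback,
                                    CFSTR("mna.replaykit.broadcastStarted"),
                                    NULL, CFNotificationSuspensionBehaviorDeliverImmediately);
}
```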
Pitfalls
- The most painful part of working with an Extension is debugging. If you want to see logs, you can debug the recording by selecting the extension scheme -> Run -> and attaching it to the main app process; after launch the logs show up in the Xcode console. However, colleagues have hit Xcode versions where this either would not run or showed no logs. An Xcode version verified to work: Version 12.1 (12A7403).
- If you need to persist logs to help track down problems, write them to the App Group shared container, which you obtain like this:

```objc
[[NSFileManager defaultManager] containerURLForSecurityApplicationGroupIdentifier:goupName];
```
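A minimal sketch of appending a log line to a file in that container; the group identifier `group.com.example.replaykit` and the file name `replaykit.log` are placeholders, not values from the original project:

```objc
// Minimal sketch (assumed group ID / file name): append one log line to a file in the App Group container.
static void MnaAppendGroupLog(NSString *line) {
    NSURL *containerURL = [[NSFileManager defaultManager]
        containerURLForSecurityApplicationGroupIdentifier:@"group.com.example.replaykit"];
    NSURL *logURL = [containerURL URLByAppendingPathComponent:@"replaykit.log"];
    NSData *data = [[line stringByAppendingString:@"\n"] dataUsingEncoding:NSUTF8StringEncoding];
    if (![[NSFileManager defaultManager] fileExistsAtPath:logURL.path]) {
        [data writeToURL:logURL atomically:YES];   // create the file on first use
        return;
    }
    NSFileHandle *handle = [NSFileHandle fileHandleForWritingToURL:logURL error:nil];
    [handle seekToEndOfFile];                      // append to the existing log
    [handle writeData:data];
    [handle closeFile];
}
```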
- Memory limit: 50 MB. This matters a lot, because the system kills the extension outright the moment it exceeds that size. If you are pushing a live stream, you must cap the buffer queue size and manage memory carefully (see the sketch below).
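As an illustration of keeping the queue bounded, here is a minimal sketch; the class name and the capacity of 30 frames are assumptions, and the right limit depends on resolution and the rest of your footprint:

```objc
#import <Foundation/Foundation.h>
#import <CoreMedia/CoreMedia.h>

// Minimal sketch (assumed capacity): a bounded frame queue that drops the oldest buffer when full,
// so the extension never grows toward the 50 MB memory ceiling.
@interface MnaBoundedFrameQueue : NSObject
- (void)enqueue:(CMSampleBufferRef)buffer;
- (CMSampleBufferRef)dequeue; // caller releases the returned buffer
@end

@implementation MnaBoundedFrameQueue {
    NSMutableArray *_frames;
    NSUInteger _capacity;
}

- (instancetype)init {
    if (self = [super init]) {
        _frames = [NSMutableArray array];
        _capacity = 30; // assumption: roughly one second of frames at 30 fps
    }
    return self;
}

- (void)enqueue:(CMSampleBufferRef)buffer {
    @synchronized (self) {
        if (_frames.count >= _capacity) {
            [_frames removeObjectAtIndex:0];       // drop the oldest frame instead of growing
        }
        [_frames addObject:(__bridge id)buffer];   // the array retains the sample buffer
    }
}

- (CMSampleBufferRef)dequeue {
    @synchronized (self) {
        if (_frames.count == 0) { return NULL; }
        CMSampleBufferRef buffer = (__bridge_retained CMSampleBufferRef)_frames.firstObject;
        [_frames removeObjectAtIndex:0];
        return buffer;                             // ownership transferred to the caller
    }
}
@end
```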
- Audio samples are delivered big-endian; convert the endianness if you need to. Code for converting to little-endian:

```objc
- (NSData *)convertAudioSamepleBufferToPcmData:(CMSampleBufferRef)sampleBuffer {
    CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
    if (blockBuffer == nil) {
        return nil;
    }
    AudioBufferList bufferList;
    CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer, NULL, &bufferList, sizeof(bufferList), NULL, NULL,
                                                            kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment, &blockBuffer);
    int8_t *audioBuffer = (int8_t *)bufferList.mBuffers[0].mData;
    UInt32 audioBufferSizeInBytes = bufferList.mBuffers[0].mDataByteSize;
    CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
    const AudioStreamBasicDescription *asbd = CMAudioFormatDescriptionGetStreamBasicDescription(formatDescription);
    // Perform an endianness conversion, if needed. A TVIAudioDevice should deliver little endian samples.
    if (asbd->mFormatFlags & kAudioFormatFlagIsBigEndian) {
        // Big endian: swap each pair of bytes in place.
        for (int i = 0; i < (audioBufferSizeInBytes - 1); i += 2) {
            int8_t temp = audioBuffer[i];
            audioBuffer[i] = audioBuffer[i + 1];
            audioBuffer[i + 1] = temp;
        }
    } else {
        // Already little endian: nothing to do.
    }
    NSData *data = [NSData dataWithBytes:audioBuffer length:audioBufferSizeInBytes];
    CFRelease(blockBuffer);
    return data;
}
```
- Annoyingly, the video frames are always delivered in portrait, no matter whether the phone is held portrait or landscape. Fortunately, since iOS 11 there is a way to read the orientation of the current video frame:

```objc
CGImagePropertyOrientation oritation = ((__bridge NSNumber *)CMGetAttachment(buffer, (__bridge CFStringRef)RPVideoSampleOrientationKey, NULL)).unsignedIntValue;

typedef CF_CLOSED_ENUM(uint32_t, CGImagePropertyOrientation) {
    kCGImagePropertyOrientationUp = 1,        // 0th row at top,    0th column on left   - default orientation
    kCGImagePropertyOrientationUpMirrored,    // 0th row at top,    0th column on right  - horizontal flip
    kCGImagePropertyOrientationDown,          // 0th row at bottom, 0th column on right  - 180 deg rotation
    kCGImagePropertyOrientationDownMirrored,  // 0th row at bottom, 0th column on left   - vertical flip
    kCGImagePropertyOrientationLeftMirrored,  // 0th row on left,   0th column at top
    kCGImagePropertyOrientationRight,         // 0th row on right,  0th column at top    - 90 deg CW
    kCGImagePropertyOrientationRightMirrored, // 0th row on right,  0th column on bottom
    kCGImagePropertyOrientationLeft           // 0th row on left,   0th column at bottom - 90 deg CCW
};
```
- Because the video always comes out as portrait, we have to rotate it and then convert it back to a `CVPixelBufferRef` for hardware encoding. There is surprisingly little material on this. Some articles use the open-source library libyuv, but they convert to i420 and then call Tencent Cloud APIs, which is not relevant here. The approach currently in use:

```objc
#pragma mark - Rotation default stream
- (void)dealWithSampleBuffer:(CMSampleBufferRef)buffer timeStamp:(uint64_t)timeStamp {
    if (@available(iOS 11.0, *)) {
        CGImagePropertyOrientation oritation = ((__bridge NSNumber *)CMGetAttachment(buffer, (__bridge CFStringRef)RPVideoSampleOrientationKey, NULL)).unsignedIntValue;
        CIImage *outputImage = nil;
        CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(buffer);
        CGFloat outputWidth = self.session.videoConfiguration.videoSize.width;
        CGFloat outputHeight = self.session.videoConfiguration.videoSize.height;
        BOOL isLandScape = self.session.videoConfiguration.landscape;
        size_t inputWidth = CVPixelBufferGetWidth(pixelBuffer);
        size_t inputHeight = CVPixelBufferGetHeight(pixelBuffer);
        CGAffineTransform lastRotateTransform = CGAffineTransformMakeScale(0.5, 0.5);
        CIImage *sourceImage = nil;
        CGImagePropertyOrientation lastRotateOritation = oritation;
        sourceImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
        // Landscape output with a landscape source (e.g. iPad Pro), or portrait output with a portrait source.
        if ((inputWidth > inputHeight && isLandScape) || (inputWidth <= inputHeight && !isLandScape)) {
            if (oritation == kCGImagePropertyOrientationUp) {
                lastRotateOritation = kCGImagePropertyOrientationUp;
            } else if (oritation == kCGImagePropertyOrientationDown) {
                lastRotateOritation = kCGImagePropertyOrientationDown;
            }
            lastRotateTransform = CGAffineTransformMakeScale(outputWidth / inputWidth, outputHeight / inputHeight);
        } else {
            if (oritation == kCGImagePropertyOrientationLeft) {
                lastRotateOritation = kCGImagePropertyOrientationRight;
            } else if (oritation == kCGImagePropertyOrientationRight) {
                lastRotateOritation = kCGImagePropertyOrientationLeft;
            } else {
                lastRotateOritation = kCGImagePropertyOrientationLeft;
            }
            lastRotateTransform = CGAffineTransformMakeScale(outputWidth / inputHeight, outputHeight / inputWidth);
        }
        sourceImage = [sourceImage imageByApplyingCGOrientation:lastRotateOritation];
        outputImage = [sourceImage imageByApplyingTransform:lastRotateTransform];
        if (outputImage) {
            NSDictionary *pixelBufferOptions = @{(NSString *)kCVPixelBufferWidthKey : @(outputWidth),
                                                 (NSString *)kCVPixelBufferHeightKey : @(outputHeight),
                                                 (NSString *)kCVPixelBufferOpenGLESCompatibilityKey : @YES,
                                                 (NSString *)kCVPixelBufferIOSurfacePropertiesKey : @{}};
            CVPixelBufferLockBaseAddress(pixelBuffer, 0);
            CVPixelBufferRef newPixcelBuffer = nil;
            CVPixelBufferCreate(kCFAllocatorDefault, outputWidth, outputHeight, kCVPixelFormatType_420YpCbCr8BiPlanarFullRange,
                                (__bridge CFDictionaryRef)pixelBufferOptions, &newPixcelBuffer);
            [_ciContext render:outputImage toCVPixelBuffer:newPixcelBuffer];
            CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
            CMVideoFormatDescriptionRef videoInfo = nil;
            CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, newPixcelBuffer, &videoInfo);
            CMTime duration = CMSampleBufferGetDuration(buffer);
            CMTime presentationTimeStamp = CMSampleBufferGetPresentationTimeStamp(buffer);
            CMTime decodeTimeStamp = CMSampleBufferGetDecodeTimeStamp(buffer);
            CMSampleTimingInfo sampleTimingInfo;
            sampleTimingInfo.duration = duration;
            sampleTimingInfo.presentationTimeStamp = presentationTimeStamp;
            sampleTimingInfo.decodeTimeStamp = decodeTimeStamp;

            CMSampleBufferRef newSampleBuffer = nil;
            CMSampleBufferCreateForImageBuffer(kCFAllocatorMalloc, newPixcelBuffer, true, nil, nil, videoInfo, &sampleTimingInfo, &newSampleBuffer);
            // Push the rebuilt buffer downstream.
            [self.session pushVideoBuffer:newSampleBuffer timeStamp:timeStamp];
            // Release everything we created.
            if (newPixcelBuffer) {
                CVPixelBufferRelease(newPixcelBuffer);
            }
            if (newSampleBuffer) {
                CFRelease(newSampleBuffer);
            }
            if (videoInfo) {
                CFRelease(videoInfo); // avoid leaking the format description in a memory-limited extension
            }
        }
    } else {
        // Fallback on earlier versions
        [self.session pushVideoBuffer:buffer timeStamp:timeStamp];
    }
}
```
- Streaming: you can refer to LFLiveKit. The points to watch are the buffer-queue size discussed above, and tuning the `LibRTMP` output chunk size inside it to reduce CPU usage.