I recently implemented a complete pipeline, from AR video capture through streaming to playback, using LFViewKit and ijkPlayer. This post records that workflow along with my understanding and usage of the LFViewKit code. My knowledge here is limited, so corrections are welcome.
Capture side
Audio and video capture
On iOS, audio and video can be captured with the camera APIs in AVFoundation, or through the ARKit or RealityKit frameworks; ARKit and RealityKit also let you add AR effects, which is why ARKit is used here. For background on ARKit itself, see the relevant learning material. No more preamble, straight to the code:
@property (nonatomic, strong) ARSCNView * scnView;
@property (nonatomic, strong) SCNScene * scene;
@property (nonatomic, strong) SCNNode * sunNode;
@property (nonatomic, strong) ARSession * session;
@property (nonatomic, strong) ARWorldTrackingConfiguration * config;
@property (nonatomic, strong) PFLiveSession * videoSession;
- (void)loadScnView
{
self.scnView = [[ARSCNView alloc] initWithFrame:CGRectMake(0, 0, ScreenSize.width, ScreenSize.height)];
[self.view addSubview:self.scnView];
// self.scnView.allowsCameraControl = YES;
self.scnView.showsStatistics = YES;
self.scnView.delegate = self;
self.session = [[ARSession alloc] init];
self.scnView.session = self.session;
self.scnView.session.delegate = self;
self.session.delegate = self;
[self loadMode];
self.config = [[ARWorldTrackingConfiguration alloc] init];
self.config.planeDetection = ARPlaneDetectionHorizontal; // detect horizontal planes
self.config.lightEstimationEnabled = YES; // enable real-world light estimation
self.config.providesAudioData = YES; // deliver captured audio via the session delegate
[self.session runWithConfiguration:self.config];
}
// Add an AR sphere
- (void)loadMode
{
SCNSphere * sunSphere = [SCNSphere sphereWithRadius:0.2];
sunSphere.firstMaterial.multiply.contents = @"art.scnassets/earth/sun.jpg";
sunSphere.firstMaterial.diffuse.contents = @"art.scnassets/earth/sun.jpg";
sunSphere.firstMaterial.multiply.intensity = 0.5;
sunSphere.firstMaterial.lightingModelName = SCNLightingModelConstant;
self.sunNode = [[SCNNode alloc] init];
self.sunNode.geometry = sunSphere;
self.sunNode.position = SCNVector3Make(0, 0, -2);
[self.scnView.scene.rootNode addChildNode:self.sunNode];
SCNAction * act = [SCNAction repeatActionForever:[SCNAction rotateByX:0 y:1 z:0 duration:1]];
[_sunNode runAction:act];
}
// Delegate callbacks for capturing audio and video
- (void)session:(ARSession *)session didOutputAudioSampleBuffer:(CMSampleBufferRef)audioSampleBuffer
{
[self.videoSession captureOutputAudioData:audioSampleBuffer];
}
// Each frame rendered by ARKit is read in this callback; self.session.currentFrame.capturedImage would return the camera image without the AR elements
- (void)renderer:(id<SCNSceneRenderer>)renderer updateAtTime:(NSTimeInterval)time
{
if (renderer.currentRenderPassDescriptor.colorAttachments[0].texture == nil) {
return;
}
CVPixelBufferRef pixelBuffer = nil;
if (renderer.currentRenderPassDescriptor.colorAttachments[0].texture.iosurface == nil) {
return;
}
CVReturn result = CVPixelBufferCreateWithIOSurface(kCFAllocatorDefault, renderer.currentRenderPassDescriptor.colorAttachments[0].texture.iosurface, nil, &pixelBuffer);
if (result != kCVReturnSuccess || pixelBuffer == nil) return;
[self.videoSession captureOutputPixelBuffer:pixelBuffer];
// balance the create above; the session must retain the buffer if it keeps it beyond this call
CVPixelBufferRelease(pixelBuffer);
}
The LFViewKit camera is built on GPUImage (which itself uses AVFoundation), and data is exchanged through LFLiveSession, so all that is needed here is to replace the GPUImage data source in LFLiveSession with our own ARKit data source.
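For illustration, here is a minimal sketch of what those two session entry points might do internally. The method names captureOutputPixelBuffer: and captureOutputAudioData: come from the calls above; everything else (the encoder property names, the millisecond timestamp convention) is an assumption, not the actual PFLiveSession implementation:
// Hypothetical sketch: forward the ARKit output straight to the encoders.
- (void)captureOutputPixelBuffer:(CVPixelBufferRef)pixelBuffer {
    if (!pixelBuffer) return;
    // assumed convention: millisecond timestamps derived from CACurrentMediaTime()
    uint64_t now = (uint64_t)(CACurrentMediaTime() * 1000);
    [self.videoEncoder encodeVideoData:pixelBuffer timeStamp:now];
}
- (void)captureOutputAudioData:(CMSampleBufferRef)sampleBuffer {
    // extract the raw PCM bytes and hand them to the AAC encoder
    CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
    size_t totalLength = 0;
    char *dataPointer = NULL;
    if (CMBlockBufferGetDataPointer(blockBuffer, 0, NULL, &totalLength, &dataPointer) != kCMBlockBufferNoErr) return;
    NSData *pcm = [NSData dataWithBytes:dataPointer length:totalLength];
    [self.audioEncoder encodeAudioData:pcm timeStamp:(uint64_t)(CACurrentMediaTime() * 1000)];
}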
Encoding
Video is encoded as H.264. LFViewKit chooses between software and hardware encoding based on the system version, but since the systems we target are practically all iOS 8 or later, software encoding is dropped here and hardware encoding is used directly. The code follows:
- Configure the audio and video parameters
PFLiveAudioConfiguration *audioConfiguration = [PFLiveAudioConfiguration new];
// Audio settings
audioConfiguration.numberOfChannels = 2; // number of channels
audioConfiguration.audioBitrate = PFLiveAudioBitRate_128Kbps; // audio bitrate
audioConfiguration.audioSampleRate = PFLiveAudioSampleRate_44100Hz; // audio sample rate
// Video settings
PFLiveVideoConfiguration *videoConfiguration = [PFLiveVideoConfiguration new];
videoConfiguration.videoSize = ScreenSize; // video size
videoConfiguration.videoBitRate = 800*1024; // video bitrate: the amount of video (or audio) data per unit time, in bps (bits per second); kbps and Mbps are the usual units
videoConfiguration.videoMaxBitRate = 1000*1024; // maximum bitrate
videoConfiguration.videoMinBitRate = 500*1024; // minimum bitrate
videoConfiguration.videoFrameRate = 15; // frame rate, i.e. fps
videoConfiguration.videoMaxKeyframeInterval = 30; // maximum keyframe interval; can be set to 2x fps, determines the GOP size
videoConfiguration.outputImageOrientation = UIInterfaceOrientationPortrait; // output video orientation
videoConfiguration.sessionPreset = PFCaptureSessionPreset360x640; // capture resolution (always 16:9; drops one level automatically if the device does not support it)
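With both configurations in hand, the session can be created and pointed at the stream. A minimal sketch, assuming this PFLiveSession fork keeps LFLiveKit's initializer and startLive: naming (an assumption; adjust to the fork's actual API):
// hypothetical LFLiveKit-style wiring; the initializer and startLive: names are assumed
self.videoSession = [[PFLiveSession alloc] initWithAudioConfiguration:audioConfiguration
                                                   videoConfiguration:videoConfiguration];
self.videoSession.delegate = self;
PFLiveStreamInfo *streamInfo = [PFLiveStreamInfo new];
streamInfo.url = @"rtmp://your.server/live/stream"; // placeholder address
[self.videoSession startLive:streamInfo];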
- Video encoding
// Create the video compression session and set its properties
- (void)resetCompressionSession {
if (compressionSession) {
// to stop encoding early, this call forces the encoder to finish outstanding frames
VTCompressionSessionCompleteFrames(compressionSession, kCMTimeInvalid);
// invalidate the session and release its resources
VTCompressionSessionInvalidate(compressionSession);
// CFRelease(compressionSession);
compressionSession = NULL;
}
// create the compression session
OSStatus status = VTCompressionSessionCreate(NULL, _configuration.videoSize.width, _configuration.videoSize.height, kCMVideoCodecType_H264, NULL, NULL, NULL, VideoCompressonOutputCallback, (__bridge void *)self, &compressionSession);
if (status != noErr) {
return;
}
_currentVideoBitRate = _configuration.videoBitRate;
// maximum interval between keyframes, also known as the keyframe rate
VTSessionSetProperty(compressionSession, kVTCompressionPropertyKey_MaxKeyFrameInterval, (__bridge CFTypeRef)@(_configuration.videoMaxKeyframeInterval));
// maximum duration from one keyframe to the next
VTSessionSetProperty(compressionSession, kVTCompressionPropertyKey_MaxKeyFrameIntervalDuration, (__bridge CFTypeRef)@(_configuration.videoMaxKeyframeInterval/_configuration.videoFrameRate));
// expected frame rate
VTSessionSetProperty(compressionSession, kVTCompressionPropertyKey_ExpectedFrameRate, (__bridge CFTypeRef)@(_configuration.videoFrameRate));
// desired average bitrate, in bits per second
VTSessionSetProperty(compressionSession, kVTCompressionPropertyKey_AverageBitRate, (__bridge CFTypeRef)@(_configuration.videoBitRate));
// hard data-rate cap as [bytes, seconds]: 1.5x the average bitrate, converted to bytes, over a 1-second window
NSArray *limit = @[@(_configuration.videoBitRate * 1.5/8), @(1)];
VTSessionSetProperty(compressionSession, kVTCompressionPropertyKey_DataRateLimits, (__bridge CFArrayRef)limit);
// whether the encoder is advised to compress in real time
VTSessionSetProperty(compressionSession, kVTCompressionPropertyKey_RealTime, kCFBooleanTrue);
// profile and level of the encoded bitstream
VTSessionSetProperty(compressionSession, kVTCompressionPropertyKey_ProfileLevel, kVTProfileLevel_H264_Main_AutoLevel);
// whether frame reordering (B-frames) is enabled
VTSessionSetProperty(compressionSession, kVTCompressionPropertyKey_AllowFrameReordering, kCFBooleanTrue);
// entropy coding mode for H.264 compression: CAVLC or CABAC
VTSessionSetProperty(compressionSession, kVTCompressionPropertyKey_H264EntropyMode, kVTH264EntropyMode_CABAC);
// optionally pre-allocate encoder resources; if not called, this happens automatically on the first encoded frame
VTCompressionSessionPrepareToEncodeFrames(compressionSession);
}
// Adjust the bitrate at runtime
- (void)setVideoBitRate:(NSInteger)videoBitRate {
if(_isBackGround) return;
VTSessionSetProperty(compressionSession, kVTCompressionPropertyKey_AverageBitRate, (__bridge CFTypeRef)@(videoBitRate));
// data-rate cap as [bytes, seconds]
NSArray *limit = @[@(videoBitRate * 1.5/8), @(1)];
VTSessionSetProperty(compressionSession, kVTCompressionPropertyKey_DataRateLimits, (__bridge CFArrayRef)limit);
_currentVideoBitRate = videoBitRate;
}
- (void)encodeVideoData:(CVPixelBufferRef)pixelBuffer timeStamp:(uint64_t)timeStamp {
if(_isBackGround) return;
frameCount++;
CMTime presentationTimeStamp = CMTimeMake(frameCount, (int32_t)_configuration.videoFrameRate);
VTEncodeInfoFlags flags;
CMTime duration = CMTimeMake(1, (int32_t)_configuration.videoFrameRate);
NSDictionary *properties = nil;
if (frameCount % (int32_t)_configuration.videoMaxKeyframeInterval == 0) {
properties = @{(__bridge NSString *)kVTEncodeFrameOptionKey_ForceKeyFrame: @YES};
}
NSNumber *timeNumber = @(timeStamp);
// encode the frame; the compressed result is delivered asynchronously
// to the session's VTCompressionOutputCallback
OSStatus status = VTCompressionSessionEncodeFrame(compressionSession, pixelBuffer, presentationTimeStamp, duration, (__bridge CFDictionaryRef)properties, (__bridge_retained void *)timeNumber, &flags);
if(status != noErr){
[self resetCompressionSession];
}
}
static void VideoCompressonOutputCallback(void *VTref, void *VTFrameRef, OSStatus status, VTEncodeInfoFlags infoFlags, CMSampleBufferRef sampleBuffer){
if (!sampleBuffer) return;
// Get the sample attachments array to inspect the frame's flags
CFArrayRef array = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, true);
if (!array) return;
// the first (and only) attachment dictionary
CFDictionaryRef dic = (CFDictionaryRef)CFArrayGetValueAtIndex(array, 0);
if (!dic) return;
// if kCMSampleAttachmentKey_NotSync is absent, this sample is a sync (key) frame
BOOL keyframe = !CFDictionaryContainsKey(dic, kCMSampleAttachmentKey_NotSync);
uint64_t timeStamp = [((__bridge_transfer NSNumber *)VTFrameRef) longLongValue];
PFHardwareVideoEncoder *videoEncoder = (__bridge PFHardwareVideoEncoder *)VTref;
if (status != noErr) {
return;
}
// keyframe marks a keyframe; videoEncoder->sps tells whether the SPS has already been captured
if (keyframe && !videoEncoder->sps) {
NSLog(@"獲取sps數(shù)據(jù)");
// get the format description; returns NULL on error
CMFormatDescriptionRef format = CMSampleBufferGetFormatDescription(sampleBuffer);
// receives the SPS length and parameter-set count
size_t sparameterSetSize, sparameterSetCount;
// pointer that will receive the SPS bytes
const uint8_t *sparameterSet;
// returns the NAL unit at the given index; these are typically parameter sets (SPS, PPS). Pass 0 to get the SPS
OSStatus statusCode = CMVideoFormatDescriptionGetH264ParameterSetAtIndex(format, 0, &sparameterSet, &sparameterSetSize, &sparameterSetCount, 0);
if (statusCode == noErr) {
// fetch the PPS the same way (index 1)
size_t pparameterSetSize, pparameterSetCount;
const uint8_t *pparameterSet;
OSStatus statusCode = CMVideoFormatDescriptionGetH264ParameterSetAtIndex(format, 1, &pparameterSet, &pparameterSetSize, &pparameterSetCount, 0);
if (statusCode == noErr) {
// store sps and pps on the PFHardwareVideoEncoder
videoEncoder->sps = [NSData dataWithBytes:sparameterSet length:sparameterSetSize];
videoEncoder->pps = [NSData dataWithBytes:pparameterSet length:pparameterSetSize];
// concatenate sps and pps
if (videoEncoder->enabledWriteVideoFile) {
// prepend Annex-B start codes and write to file, producing NALUs
NSMutableData *data = [[NSMutableData alloc] init];
uint8_t header[] = {0x00, 0x00, 0x00, 0x01};
[data appendBytes:header length:4];
[data appendData:videoEncoder->sps];
[data appendBytes:header length:4];
[data appendData:videoEncoder->pps];
fwrite(data.bytes, 1, data.length, videoEncoder->fp);
}
}
}
}
// the encoded data
CMBlockBufferRef dataBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
size_t length, totalLength;
char *dataPointer;
// get a pointer to the contiguous bytes held in dataBuffer
OSStatus statusCodeRet = CMBlockBufferGetDataPointer(dataBuffer, 0, &length, &totalLength, &dataPointer);
if (statusCodeRet == noErr) {
size_t bufferOffset = 0;
static const int AVCCHeaderLength = 4;
// walk every NAL unit contained in dataBuffer
while (bufferOffset < totalLength - AVCCHeaderLength) {
// Read the NAL unit length
uint32_t NALUnitLength = 0;
// copy AVCCHeaderLength bytes starting at dataPointer + bufferOffset into NALUnitLength
memcpy(&NALUnitLength, dataPointer + bufferOffset, AVCCHeaderLength);
// convert the length from big-endian to host byte order
NALUnitLength = CFSwapInt32BigToHost(NALUnitLength);
// build the frame object
PFVideoFrame *videoFrame = [PFVideoFrame new];
videoFrame.timestamp = timeStamp;
videoFrame.data = [[NSData alloc] initWithBytes:(dataPointer + bufferOffset + AVCCHeaderLength) length:NALUnitLength];
videoFrame.isKeyFrame = keyframe;
videoFrame.sps = videoEncoder->sps;
videoFrame.pps = videoEncoder->pps;
if (videoEncoder.h264Delegate && [videoEncoder.h264Delegate respondsToSelector:@selector(videoEncoder:videoFrame:)]) {
[videoEncoder.h264Delegate videoEncoder:videoEncoder videoFrame:videoFrame];
}
// write the data out, producing Annex-B NALUs
if (videoEncoder->enabledWriteVideoFile) {
NSMutableData *data = [[NSMutableData alloc] init];
if (keyframe) { // keyframe: 4-byte start code
uint8_t header[] = {0x00, 0x00, 0x00, 0x01};
[data appendBytes:header length:4];
} else {
// non-keyframe: 3-byte start code
uint8_t header[] = {0x00, 0x00, 0x01};
[data appendBytes:header length:3];
}
// append the payload
[data appendData:videoFrame.data];
fwrite(data.bytes, 1, data.length, videoEncoder->fp);
}
// advance past this NAL unit
bufferOffset += AVCCHeaderLength + NALUnitLength;
}
}
}
- Audio encoding
- (void)encodeAudioData:(nullable NSData*)audioData timeStamp:(uint64_t)timeStamp {
if (![self createAudioConvert]) {
return;
}
if(leftLength + audioData.length >= self.configuration.bufferLength){
///< enough data buffered: encode and send
NSInteger totalSize = leftLength + audioData.length;
NSInteger encodeCount = totalSize/self.configuration.bufferLength;
char *totalBuf = malloc(totalSize);
char *p = totalBuf;
memset(totalBuf, 0, totalSize);
memcpy(totalBuf, leftBuf, leftLength);
memcpy(totalBuf + leftLength, audioData.bytes, audioData.length);
for(NSInteger index = 0;index < encodeCount;index++){
[self encodeBuffer:p timeStamp:timeStamp];
p += self.configuration.bufferLength;
}
// keep the leftover bytes for the next pass
leftLength = totalSize%self.configuration.bufferLength;
memset(leftBuf, 0, self.configuration.bufferLength);
memcpy(leftBuf, totalBuf + (totalSize -leftLength), leftLength);
free(totalBuf);
}else{
///< not enough data yet: accumulate
memcpy(leftBuf+leftLength, audioData.bytes, audioData.length);
leftLength = leftLength + audioData.length;
}
}
- (void)encodeBuffer:(char*)buf timeStamp:(uint64_t)timeStamp{
AudioBuffer inBuffer;
inBuffer.mNumberChannels = 1;
inBuffer.mData = buf;
inBuffer.mDataByteSize = (UInt32)self.configuration.bufferLength;
AudioBufferList buffers;
buffers.mNumberBuffers = 1;
buffers.mBuffers[0] = inBuffer;
AudioBufferList outBufferList;
outBufferList.mNumberBuffers = 1;
outBufferList.mBuffers[0].mNumberChannels = inBuffer.mNumberChannels;
outBufferList.mBuffers[0].mDataByteSize = inBuffer.mDataByteSize; // output buffer size
outBufferList.mBuffers[0].mData = aacBuf; // AAC output buffer
UInt32 outputDataPacketSize = 1;
if (AudioConverterFillComplexBuffer(m_converter, inputDataProc, &buffers, &outputDataPacketSize, &outBufferList, NULL) != noErr) {
return;
}
PFAudioFrame *audioFrame = [PFAudioFrame new];
audioFrame.timestamp = timeStamp;
audioFrame.data = [NSData dataWithBytes:aacBuf length:outBufferList.mBuffers[0].mDataByteSize];
char exeData[2];
exeData[0] = _configuration.asc[0];
exeData[1] = _configuration.asc[1];
audioFrame.audioInfo = [NSData dataWithBytes:exeData length:2];
if (self.aacDeleage && [self.aacDeleage respondsToSelector:@selector(audioEncoder:audioFrame:)]) {
[self.aacDeleage audioEncoder:self audioFrame:audioFrame];
}
if (self->enabledWriteVideoFile) {
NSData *adts = [self adtsData:_configuration.numberOfChannels rawDataLength:audioFrame.data.length];
fwrite(adts.bytes, 1, adts.length, self->fp);
fwrite(audioFrame.data.bytes, 1, audioFrame.data.length, self->fp);
}
}
- (void)stopEncoder {
}
#pragma mark -- CustomMethod
- (BOOL)createAudioConvert { // initialize a PCM-to-AAC converter matching the input samples
if (m_converter != nil) {
return TRUE;
}
AudioStreamBasicDescription inputFormat = {0};
inputFormat.mSampleRate = _configuration.audioSampleRate;
inputFormat.mFormatID = kAudioFormatLinearPCM;
inputFormat.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagsNativeEndian | kAudioFormatFlagIsPacked;
inputFormat.mChannelsPerFrame = (UInt32)_configuration.numberOfChannels;
inputFormat.mFramesPerPacket = 1;
inputFormat.mBitsPerChannel = 16;
inputFormat.mBytesPerFrame = inputFormat.mBitsPerChannel / 8 * inputFormat.mChannelsPerFrame;
inputFormat.mBytesPerPacket = inputFormat.mBytesPerFrame * inputFormat.mFramesPerPacket;
AudioStreamBasicDescription outputFormat; // the output audio format starts here
memset(&outputFormat, 0, sizeof(outputFormat));
outputFormat.mSampleRate = inputFormat.mSampleRate; // keep the sample rate
outputFormat.mFormatID = kAudioFormatMPEG4AAC; // AAC encoding (kAudioFormatMPEG4AAC; kAudioFormatMPEG4AAC_HE_V2 is another option)
outputFormat.mChannelsPerFrame = (UInt32)_configuration.numberOfChannels;
outputFormat.mFramesPerPacket = 1024; // one AAC packet carries 1024 frames (samples per channel), not bytes
const OSType subtype = kAudioFormatMPEG4AAC;
AudioClassDescription requestedCodecs[2] = {
{
kAudioEncoderComponentType,
subtype,
kAppleSoftwareAudioCodecManufacturer
},
{
kAudioEncoderComponentType,
subtype,
kAppleHardwareAudioCodecManufacturer
}
};
OSStatus result = AudioConverterNewSpecific(&inputFormat, &outputFormat, 2, requestedCodecs, &m_converter);
UInt32 outputBitrate = _configuration.audioBitrate;
UInt32 propSize = sizeof(outputBitrate);
if(result == noErr) {
result = AudioConverterSetProperty(m_converter, kAudioConverterEncodeBitRate, propSize, &outputBitrate);
}
return YES;
}
#pragma mark -- AudioCallBack
OSStatus inputDataProc(AudioConverterRef inConverter, UInt32 *ioNumberDataPackets, AudioBufferList *ioData, AudioStreamPacketDescription * *outDataPacketDescription, void *inUserData) { // AudioConverterFillComplexBuffer calls this during encoding to request input, i.e. the raw PCM data
AudioBufferList bufferList = *(AudioBufferList *)inUserData;
ioData->mBuffers[0].mNumberChannels = 1;
ioData->mBuffers[0].mData = bufferList.mBuffers[0].mData;
ioData->mBuffers[0].mDataByteSize = bufferList.mBuffers[0].mDataByteSize;
return noErr;
}
Streaming
The streaming part uses pili-librtmp from LFViewKit; the annotations on the librtmp API below come mainly from the article 使用librtmp庫進行推流與拉流. The code follows:
- Bitrate control
#import "PFStreamingBuffer.h"
#import "NSMutableArray+PFAdd.h"
static const NSUInteger defaultSortBufferMaxCount = 5;///< reorder at most 5 frames
static const NSUInteger defaultUpdateInterval = 1;///< sample every 1 s
static const NSUInteger defaultCallBackInterval = 5;///< a 5 s network-monitoring cycle
static const NSUInteger defaultSendBufferMaxCount = 600;///< send buffer caps at 600 frames
@interface PFStreamingBuffer (){
dispatch_semaphore_t _lock;
}
@property (nonatomic, strong) NSMutableArray <PFFrame *> *sortList;
@property (nonatomic, strong, readwrite) NSMutableArray <PFFrame *> *list;
@property (nonatomic, strong) NSMutableArray *thresholdList;
/** Buffer bookkeeping */
@property (nonatomic, assign) NSInteger currentInterval; //
@property (nonatomic, assign) NSInteger callBackInterval; //
@property (nonatomic, assign) NSInteger updateInterval; //
@property (nonatomic, assign) BOOL startTimer; // whether the sampling timer has been started
@end
@implementation PFStreamingBuffer
- (instancetype)init {
if (self = [super init]) {
_lock = dispatch_semaphore_create(1);
self.updateInterval = defaultUpdateInterval;
self.callBackInterval = defaultCallBackInterval;
self.maxCount = defaultSendBufferMaxCount;
self.lastDropFrames = 0;
self.startTimer = NO;
}
return self;
}
#pragma mark -- Custom
- (void)appendObject:(PFFrame *)frame {
if (!frame) return;
if (!_startTimer) {
_startTimer = YES;
[self tick]; // start monitoring
}
dispatch_semaphore_wait(_lock, DISPATCH_TIME_FOREVER);
if (self.sortList.count < defaultSortBufferMaxCount) { // below the sort-buffer cap: just append the new frame
[self.sortList addObject:frame];
} else {
///< sort
[self.sortList addObject:frame];
[self.sortList sortUsingFunction:frameDataCompare context:nil]; // order frames by timestamp
/// drop frames if the send buffer is overfull
[self removeExpireFrame];
/// move the oldest frame into the send buffer
PFFrame *firstFrame = [self.sortList pfPopFirstObject];
if (firstFrame) [self.list addObject:firstFrame];
}
dispatch_semaphore_signal(_lock);
}
- (PFFrame *)popFirstObject {
dispatch_semaphore_wait(_lock, DISPATCH_TIME_FOREVER);
PFFrame *firstFrame = [self.list pfPopFirstObject];
dispatch_semaphore_signal(_lock);
return firstFrame;
}
- (void)removeAllObject {
dispatch_semaphore_wait(_lock, DISPATCH_TIME_FOREVER);
[self.list removeAllObjects];
dispatch_semaphore_signal(_lock);
}
// Drop frames once the send buffer is full
- (void)removeExpireFrame {
if (self.list.count < self.maxCount) return; // still below the configured cap, nothing to drop
NSArray *pFrames = [self expirePFrames];///< the P-frames between the first P and the first I
self.lastDropFrames += [pFrames count];
if (pFrames && pFrames.count > 0) {
[self.list removeObjectsInArray:pFrames];
return;
}
NSArray *iFrames = [self expireIFrames];///< drop one I-frame (a single I-frame may span several NALs)
self.lastDropFrames += [iFrames count];
if (iFrames && iFrames.count > 0) {
[self.list removeObjectsInArray:iFrames];
return;
}
[self.list removeAllObjects];
}
// Collect expired frames: if the list starts with an I-frame, the data from that I-frame to the next one is dropped; otherwise the data before the first I-frame is dropped
- (NSArray *)expirePFrames {
NSMutableArray *pframes = [[NSMutableArray alloc] init];
for (NSInteger index = 0; index < self.list.count; index++) {
PFFrame *frame = [self.list objectAtIndex:index];
if ([frame isKindOfClass:[PFVideoFrame class]]) {
PFVideoFrame *videoFrame = (PFVideoFrame *)frame;
if (videoFrame.isKeyFrame && pframes.count > 0) {
break;
} else if (!videoFrame.isKeyFrame) {
[pframes addObject:frame];
}
}
}
return pframes;
}
// Collect the first I-frame (all entries sharing its timestamp)
- (NSArray *)expireIFrames {
NSMutableArray *iframes = [[NSMutableArray alloc] init];
uint64_t timeStamp = 0;
for (NSInteger index = 0; index < self.list.count; index++) {
PFFrame *frame = [self.list objectAtIndex:index];
// find the first keyframe
if ([frame isKindOfClass:[PFVideoFrame class]] && ((PFVideoFrame *)frame).isKeyFrame) {
if (timeStamp != 0 && timeStamp != frame.timestamp) {
break;
}
[iframes addObject:frame];
timeStamp = frame.timestamp;
}
}
return iframes;
}
// Sort comparator: ascending by timestamp
NSInteger frameDataCompare(id obj1, id obj2, void *context){
PFFrame *frame1 = (PFFrame *)obj1;
PFFrame *frame2 = (PFFrame *)obj2;
if (frame1.timestamp == frame2.timestamp) {
return NSOrderedSame;
}else if (frame1.timestamp > frame2.timestamp){
return NSOrderedDescending;
}
return NSOrderedAscending;
}
// Compares the self.list counts across five samples: if the count keeps growing, increaseCount rises and the bitrate should be lowered;
// if the count keeps shrinking, decreaseCount rises and the bitrate can be raised
- (PFLiveBuffferState)currentBufferState {
NSInteger currentCount = 0;
NSInteger increaseCount = 0;
NSInteger decreaseCount = 0;
NSLog(@"個數(shù):%ld", self.thresholdList.count);
for (NSNumber *number in self.thresholdList) {
NSLog(@"number:%ld--currentCount:%ld--increaseCount:%ld--decreaseCount:%ld", number.integerValue, currentCount, increaseCount, decreaseCount);
if (number.integerValue > currentCount) {
// the buffer is growing: the bitrate needs to come down
increaseCount++;
} else{
// the buffer is shrinking: the bitrate can go up
decreaseCount++;
}
currentCount = [number integerValue];
}
if (increaseCount >= self.callBackInterval) {
// lower the bitrate
NSLog(@"lowering bitrate");
return PFLiveBuffferIncrease;
}
if (decreaseCount >= self.callBackInterval) {
// raise the bitrate
NSLog(@"raising bitrate");
return PFLiveBuffferDecline;
}
return PFLiveBuffferUnknown;
}
#pragma mark -- Setter Getter
- (NSMutableArray *)list {
if (!_list) {
_list = [[NSMutableArray alloc] init];
}
return _list;
}
- (NSMutableArray *)sortList {
if (!_sortList) {
_sortList = [[NSMutableArray alloc] init];
}
return _sortList;
}
- (NSMutableArray *)thresholdList {
if (!_thresholdList) {
_thresholdList = [[NSMutableArray alloc] init];
}
return _thresholdList;
}
#pragma mark -- Sampling
- (void)tick {
/** Sampling: if the network stays good (or stays bad) across the whole cycle, notify the delegate */
_currentInterval += self.updateInterval;
dispatch_semaphore_wait(_lock, DISPATCH_TIME_FOREVER);
[self.thresholdList addObject:@(self.list.count)];
dispatch_semaphore_signal(_lock);
// NSLog(@"currentInterval:%ld--callBackInterval:%ld--updateInterval:%ld", self.currentInterval, self.callBackInterval, self.updateInterval);
if (self.currentInterval >= self.callBackInterval) { // a full callback interval (5 s) has elapsed
PFLiveBuffferState state = [self currentBufferState];
if (state == PFLiveBuffferIncrease) {
if (self.delegate && [self.delegate respondsToSelector:@selector(streamingBuffer:bufferState:)]) {
[self.delegate streamingBuffer:self bufferState:PFLiveBuffferIncrease];
}
} else if (state == PFLiveBuffferDecline) {
if (self.delegate && [self.delegate respondsToSelector:@selector(streamingBuffer:bufferState:)]) {
// report the buffer state back to the session so it can adjust the bitrate
[self.delegate streamingBuffer:self bufferState:PFLiveBuffferDecline];
}
}
self.currentInterval = 0;
[self.thresholdList removeAllObjects];
}
__weak typeof(self) _self = self;
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(self.updateInterval * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
__strong typeof(_self) self = _self;
[self tick];
});
}
@end
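On the session side, these buffer-state callbacks are what eventually drive the encoder's setVideoBitRate: shown earlier. A sketch of that adaptation step, assuming LFLiveKit-style wiring; the 100 kbps step size and the encoder property names are illustrative choices, not the project's exact values:
// hypothetical session-side handler; encoder property names and step size are assumed
- (void)socketBufferStatus:(nullable id)socket status:(PFLiveBuffferState)status {
    NSUInteger step = 100 * 1024; // illustrative 100 kbps step
    NSUInteger bitRate = self.videoEncoder.videoBitRate;
    if (status == PFLiveBuffferIncrease && bitRate >= self.videoConfiguration.videoMinBitRate + step) {
        // the send buffer keeps growing: the uplink cannot keep pace, lower the bitrate
        [self.videoEncoder setVideoBitRate:bitRate - step];
    } else if (status == PFLiveBuffferDecline && bitRate + step <= self.videoConfiguration.videoMaxBitRate) {
        // the send buffer keeps draining: there is headroom, raise the bitrate
        [self.videoEncoder setVideoBitRate:bitRate + step];
    }
}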
- The streaming socket
- (nullable instancetype)initWithStream:(nullable PFLiveStreamInfo *)stream{
return [self initWithStream:stream reconnectInterval:0 reconnectCount:0];
}
- (nullable instancetype)initWithStream:(nullable PFLiveStreamInfo *)stream reconnectInterval:(NSInteger)reconnectInterval reconnectCount:(NSInteger)reconnectCount{
if (!stream) @throw [NSException exceptionWithName:@"LFStreamRtmpSocket init error" reason:@"stream is nil" userInfo:nil];
if (self = [super init]) {
_stream = stream;
if (reconnectInterval > 0) _reconnectInterval = reconnectInterval;
else _reconnectInterval = RetryTimesMargin;
if (reconnectCount > 0) _reconnectCount = reconnectCount;
else _reconnectCount = RetryTimesBreaken;
[self addObserver:self forKeyPath:@"isSending" options:NSKeyValueObservingOptionNew context:nil]; // KVO on isSending so that sending can resume after a send error
}
return self;
}
- (void)dealloc{
[self removeObserver:self forKeyPath:@"isSending"];
}
- (void)start {
dispatch_async(self.rtmpSendQueue, ^{
[self _start];
});
}
- (void)_start {
if (!_stream) return;
if (_isConnecting) return;
if (_rtmp != NULL) return;
self.debugInfo.streamId = self.stream.streamId;
self.debugInfo.uploadUrl = self.stream.url;
self.debugInfo.isRtmp = YES;
if (_isConnecting) return;
_isConnecting = YES;
if (self.delegate && [self.delegate respondsToSelector:@selector(socketStatus:status:)]) {
// report the connection state as pending
[self.delegate socketStatus:self status:PFLivePending];
}
if (_rtmp != NULL) { // tear down any existing connection first
PILI_RTMP_Close(_rtmp, &_error);
PILI_RTMP_Free(_rtmp);
}
// connect to the remote server
[self RTMP264_Connect:(char *)[_stream.url cStringUsingEncoding:NSASCIIStringEncoding]];
}
// Stop pushing
- (void)stop {
dispatch_async(self.rtmpSendQueue, ^{
[self _stop];
[NSObject cancelPreviousPerformRequestsWithTarget:self];
});
}
- (void)_stop {
if (self.delegate && [self.delegate respondsToSelector:@selector(socketStatus:status:)]) {
[self.delegate socketStatus:self status:PFLiveStop];
}
if (_rtmp != NULL) {
PILI_RTMP_Close(_rtmp, &_error);
PILI_RTMP_Free(_rtmp);
_rtmp = NULL;
}
[self clean];
}
- (void)sendFrame:(PFFrame *)frame {
if (!frame) return;
// enqueue the frame
[self.buffer appendObject:frame];
if(!self.isSending){
[self sendFrame];
}
}
- (void)setDelegate:(id<PFStreamSocketDelegate>)delegate {
_delegate = delegate;
}
#pragma mark -- CustomMethod
- (void)sendFrame {
__weak typeof(self) _self = self;
dispatch_async(self.rtmpSendQueue, ^{
if (!_self.isSending && _self.buffer.list.count > 0) {
_self.isSending = YES;
if (!_self.isConnected || _self.isReconnecting || _self.isConnecting || !_rtmp){ // bail out unless connected, not reconnecting, not mid-connect, and the rtmp handle exists
_self.isSending = NO;
return;
}
// pop the oldest frame
PFFrame *frame = [_self.buffer popFirstObject];
if ([frame isKindOfClass:[PFVideoFrame class]]) { // video frame
// send the sequence header first if it has not gone out yet
if (!_self.sendVideoHead) {
_self.sendVideoHead = YES;
if(!((PFVideoFrame*)frame).sps || !((PFVideoFrame*)frame).pps){
_self.isSending = NO;
return;
}
// send the video header first
[_self sendVideoHeader:(PFVideoFrame *)frame];
} else {
// send a regular (non-header) video frame
[_self sendVideo:(PFVideoFrame *)frame];
}
} else { // audio frame
if (!_self.sendAudioHead) {
_self.sendAudioHead = YES;
if(!((PFAudioFrame*)frame).audioInfo){
_self.isSending = NO;
return;
}
[_self sendAudioHeader:(PFAudioFrame *)frame];
} else {
[_self sendAudio:frame];
}
}
// update debug info
_self.debugInfo.totalFrame++;
_self.debugInfo.dropFrame += _self.buffer.lastDropFrames;
_self.buffer.lastDropFrames = 0;
_self.debugInfo.dataFlow += frame.data.length;
_self.debugInfo.elapsedMilli = CACurrentMediaTime() * 1000 - _self.debugInfo.timeStamp;
if (_self.debugInfo.elapsedMilli < 1000) {
_self.debugInfo.bandwidth += frame.data.length;
if ([frame isKindOfClass:[PFAudioFrame class]]) {
_self.debugInfo.capturedAudioCount++;
} else {
_self.debugInfo.capturedVideoCount++;
}
_self.debugInfo.unSendCount = _self.buffer.list.count;
} else {
_self.debugInfo.currentBandwidth = _self.debugInfo.bandwidth;
_self.debugInfo.currentCapturedAudioCount = _self.debugInfo.capturedAudioCount;
_self.debugInfo.currentCapturedVideoCount = _self.debugInfo.capturedVideoCount;
if (_self.delegate && [_self.delegate respondsToSelector:@selector(socketDebug:debugInfo:)]) {
[_self.delegate socketDebug:_self debugInfo:_self.debugInfo];
}
_self.debugInfo.bandwidth = 0;
_self.debugInfo.capturedAudioCount = 0;
_self.debugInfo.capturedVideoCount = 0;
_self.debugInfo.timeStamp = CACurrentMediaTime() * 1000;
}
// reset the sending flag
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
//< dispatched async purely to avoid re-entering sendFrame; the current call unwinds before the next starts
_self.isSending = NO;
});
}
});
}
- (void)clean {
_isConnecting = NO;
_isReconnecting = NO;
_isSending = NO;
_isConnected = NO;
_sendAudioHead = NO;
_sendVideoHead = NO;
self.debugInfo = nil;
[self.buffer removeAllObject];
self.retryTimes4netWorkBreaken = 0;
}
// Establish the connection
- (NSInteger)RTMP264_Connect:(char *)push_url {
_rtmp = PILI_RTMP_Alloc();
PILI_RTMP_Init(_rtmp);
// set session parameters
if (PILI_RTMP_SetupURL(_rtmp, push_url, &_error) == FALSE) {
//log(LOG_ERR, "RTMP_SetupURL() failed!");
goto Failed;
}
// set the error and connection-time callbacks
_rtmp->m_errorCallback = RTMPErrorCallback;
_rtmp->m_connCallback = ConnectionTimeCallback;
_rtmp->m_userData = (__bridge void *)self;
_rtmp->m_msgCounter = 1;
_rtmp->Link.timeout = RTMP_RECEIVE_TIMEOUT; // 鏈接超時時間
// enabling write means publishing; without it this would be pulling
PILI_RTMP_EnableWrite(_rtmp);
// establish the NetConnection of the RTMP link
if (PILI_RTMP_Connect(_rtmp, NULL, &_error) == FALSE) {
goto Failed;
}
// establish the NetStream of the RTMP link
if (PILI_RTMP_ConnectStream(_rtmp, 0, &_error) == FALSE) {
goto Failed;
}
// report back through the delegate that streaming has started
if (self.delegate && [self.delegate respondsToSelector:@selector(socketStatus:status:)]) {
[self.delegate socketStatus:self status:PFLiveStart];
}
[self sendMetaData];
_isConnected = YES;
_isConnecting = NO;
_isReconnecting = NO;
_isSending = NO;
return 0;
Failed:
PILI_RTMP_Close(_rtmp, &_error);
PILI_RTMP_Free(_rtmp);
_rtmp = NULL;
[self reconnect];
return -1;
}
#pragma mark -- Rtmp Send
- (void)sendMetaData {
PILI_RTMPPacket packet;
char pbuf[2048], *pend = pbuf + sizeof(pbuf);
packet.m_nChannel = 0x03; // control channel (invoke)
packet.m_headerType = RTMP_PACKET_SIZE_LARGE; // chunk header type
packet.m_packetType = RTMP_PACKET_TYPE_INFO; // packet type: metadata
packet.m_nTimeStamp = 0; // timestamp
packet.m_nInfoField2 = _rtmp->m_stream_id; // stream id
packet.m_hasAbsTimestamp = TRUE; // absolute timestamp?
packet.m_body = pbuf + RTMP_MAX_HEADER_SIZE;
char *enc = packet.m_body;
enc = AMF_EncodeString(enc, pend, &av_setDataFrame);
enc = AMF_EncodeString(enc, pend, &av_onMetaData);
*enc++ = AMF_OBJECT;
enc = AMF_EncodeNamedNumber(enc, pend, &av_duration, 0.0);
enc = AMF_EncodeNamedNumber(enc, pend, &av_fileSize, 0.0);
// videosize
enc = AMF_EncodeNamedNumber(enc, pend, &av_width, _stream.videoConfiguration.videoSize.width);
enc = AMF_EncodeNamedNumber(enc, pend, &av_height, _stream.videoConfiguration.videoSize.height);
// video
enc = AMF_EncodeNamedString(enc, pend, &av_videocodecid, &av_avc1);
enc = AMF_EncodeNamedNumber(enc, pend, &av_videodatarate, _stream.videoConfiguration.videoBitRate / 1000.f);
enc = AMF_EncodeNamedNumber(enc, pend, &av_framerate, _stream.videoConfiguration.videoFrameRate);
// audio
enc = AMF_EncodeNamedString(enc, pend, &av_audiocodecid, &av_mp4a);
enc = AMF_EncodeNamedNumber(enc, pend, &av_audiodatarate, _stream.audioConfiguration.audioBitrate);
enc = AMF_EncodeNamedNumber(enc, pend, &av_audiosamplerate, _stream.audioConfiguration.audioSampleRate);
enc = AMF_EncodeNamedNumber(enc, pend, &av_audiosamplesize, 16.0);
enc = AMF_EncodeNamedBoolean(enc, pend, &av_stereo, _stream.audioConfiguration.numberOfChannels == 2);
// sdk version
enc = AMF_EncodeNamedString(enc, pend, &av_encoder, &av_SDKVersion);
*enc++ = 0;
*enc++ = 0;
*enc++ = AMF_OBJECT_END;
packet.m_nBodySize = (uint32_t)(enc - packet.m_body);
if (!PILI_RTMP_SendPacket(_rtmp, &packet, FALSE, &_error)) {
return;
}
}
- (void)sendVideoHeader:(PFVideoFrame *)videoFrame {
unsigned char *body = NULL;
NSInteger iIndex = 0;
NSInteger rtmpLength = 1024;
const char *sps = videoFrame.sps.bytes;
const char *pps = videoFrame.pps.bytes;
NSInteger sps_len = videoFrame.sps.length;
NSInteger pps_len = videoFrame.pps.length;
body = (unsigned char *)malloc(rtmpLength);
memset(body, 0, rtmpLength);
body[iIndex++] = 0x17;
body[iIndex++] = 0x00;
body[iIndex++] = 0x00;
body[iIndex++] = 0x00;
body[iIndex++] = 0x00;
body[iIndex++] = 0x01;
body[iIndex++] = sps[1];
body[iIndex++] = sps[2];
body[iIndex++] = sps[3];
body[iIndex++] = 0xff;
/* sps (lengths are written big-endian) */
body[iIndex++] = 0xe1;
body[iIndex++] = (sps_len >> 8) & 0xff;
body[iIndex++] = sps_len & 0xff;
memcpy(&body[iIndex], sps, sps_len);
iIndex += sps_len;
/*pps*/
body[iIndex++] = 0x01;
body[iIndex++] = (pps_len >> 8) & 0xff;
body[iIndex++] = (pps_len) & 0xff;
memcpy(&body[iIndex], pps, pps_len);
iIndex += pps_len;
[self sendPacket:RTMP_PACKET_TYPE_VIDEO data:body size:iIndex nTimestamp:0];
free(body);
}
- (void)sendVideo:(PFVideoFrame *)frame {
NSInteger i = 0;
NSInteger rtmpLength = frame.data.length + 9;
unsigned char *body = (unsigned char *)malloc(rtmpLength);
memset(body, 0, rtmpLength);
if (frame.isKeyFrame) {
body[i++] = 0x17; // 1:Iframe 7:AVC
} else {
body[i++] = 0x27; // 2:Pframe 7:AVC
}
body[i++] = 0x01; // AVC NALU
body[i++] = 0x00;
body[i++] = 0x00;
body[i++] = 0x00;
body[i++] = (frame.data.length >> 24) & 0xff;
body[i++] = (frame.data.length >> 16) & 0xff;
body[i++] = (frame.data.length >> 8) & 0xff;
body[i++] = (frame.data.length) & 0xff;
memcpy(&body[i], frame.data.bytes, frame.data.length);
[self sendPacket:RTMP_PACKET_TYPE_VIDEO data:body size:(rtmpLength) nTimestamp:frame.timestamp];
free(body);
}
// Wrap the data in a PILI_RTMPPacket and send it
- (NSInteger)sendPacket:(unsigned int)nPacketType data:(unsigned char *)data size:(NSInteger)size nTimestamp:(uint64_t)nTimestamp {
// build the RTMP packet
NSInteger rtmpLength = size;
PILI_RTMPPacket rtmp_pack;
PILI_RTMPPacket_Reset(&rtmp_pack);
PILI_RTMPPacket_Alloc(&rtmp_pack, (uint32_t)rtmpLength);
rtmp_pack.m_nBodySize = (uint32_t)size; // body size
memcpy(rtmp_pack.m_body, data, size);
rtmp_pack.m_hasAbsTimestamp = 0; // is the timestamp absolute or relative?
rtmp_pack.m_packetType = nPacketType; // message type ID (1-7 protocol control; 8,9 audio/video; 10 and above AMF-encoded messages)
if (_rtmp) rtmp_pack.m_nInfoField2 = _rtmp->m_stream_id;
rtmp_pack.m_nChannel = 0x04; // chunk stream id
rtmp_pack.m_headerType = RTMP_PACKET_SIZE_LARGE; // largest chunk header type
if (RTMP_PACKET_TYPE_AUDIO == nPacketType && size != 4) {
rtmp_pack.m_headerType = RTMP_PACKET_SIZE_MEDIUM;
}
rtmp_pack.m_nTimeStamp = (uint32_t)nTimestamp;
NSInteger nRet = [self RtmpPacketSend:&rtmp_pack];
PILI_RTMPPacket_Free(&rtmp_pack);
return nRet;
}
// Send the packet
- (NSInteger)RtmpPacketSend:(PILI_RTMPPacket *)packet {
if (_rtmp && PILI_RTMP_IsConnected(_rtmp)) {
// send the packet and return the result
int success = PILI_RTMP_SendPacket(_rtmp, packet, 0, &_error);
return success;
}
return -1;
}
// Wrap the audio sequence header
- (void)sendAudioHeader:(PFAudioFrame *)audioFrame {
NSInteger rtmpLength = audioFrame.audioInfo.length + 2; /* spec data length, usually 2 */
unsigned char *body = (unsigned char *)malloc(rtmpLength);
memset(body, 0, rtmpLength);
/*AF 00 + AAC RAW data*/
body[0] = 0xAF;
body[1] = 0x00;
memcpy(&body[2], audioFrame.audioInfo.bytes, audioFrame.audioInfo.length); /*spec_buf是AAC sequence header數(shù)據(jù)*/
[self sendPacket:RTMP_PACKET_TYPE_AUDIO data:body size:rtmpLength nTimestamp:0];
free(body);
}
// Wrap raw audio data
- (void)sendAudio:(PFFrame *)frame {
NSInteger rtmpLength = frame.data.length + 2; /* plus the 2-byte AF 01 prefix */
unsigned char *body = (unsigned char *)malloc(rtmpLength);
memset(body, 0, rtmpLength);
/*AF 01 + AAC RAW data*/
body[0] = 0xAF;
body[1] = 0x01;
memcpy(&body[2], frame.data.bytes, frame.data.length);
[self sendPacket:RTMP_PACKET_TYPE_AUDIO data:body size:rtmpLength nTimestamp:frame.timestamp];
free(body);
}
// Reconnect after the line drops
- (void)reconnect {
dispatch_async(self.rtmpSendQueue, ^{
// retry while the count is below reconnectCount and no reconnect is already in flight
if (self.retryTimes4netWorkBreaken++ < self.reconnectCount && !self.isReconnecting) {
self.isConnected = NO;
self.isConnecting = NO;
self.isReconnecting = YES;
dispatch_async(dispatch_get_main_queue(), ^{
// call the reconnect method again after the configured delay
[self performSelector:@selector(_reconnect) withObject:nil afterDelay:self.reconnectInterval];
});
} else if (self.retryTimes4netWorkBreaken >= self.reconnectCount) {
// once retries exceed reconnectCount, report reconnect failure
if (self.delegate && [self.delegate respondsToSelector:@selector(socketStatus:status:)]) {
[self.delegate socketStatus:self status:PFLiveError];
}
if (self.delegate && [self.delegate respondsToSelector:@selector(socketDidError:errorCode:)]) {
[self.delegate socketDidError:self errorCode:PFLiveSocketError_ReConnectTimeOut];
}
}
});
}
// The actual reconnect
- (void)_reconnect{
[NSObject cancelPreviousPerformRequestsWithTarget:self];
_isReconnecting = NO;
if (_isConnected) return;
if (_rtmp != NULL) {
PILI_RTMP_Close(_rtmp, &_error);
PILI_RTMP_Free(_rtmp);
_rtmp = NULL;
}
_sendAudioHead = NO;
_sendVideoHead = NO;
if (self.delegate && [self.delegate respondsToSelector:@selector(socketStatus:status:)]) {
[self.delegate socketStatus:self status:PFLiveRefresh];
}
[self RTMP264_Connect:(char *)[_stream.url cStringUsingEncoding:NSASCIIStringEncoding]];
}
#pragma mark -- CallBack
void RTMPErrorCallback(RTMPError *error, void *userData) {
PFStreamRTMPSocket *socket = (__bridge PFStreamRTMPSocket *)userData;
if (error->code < 0) {
[socket reconnect];
}
}
void ConnectionTimeCallback(PILI_CONNECTION_TIME *conn_time, void *userData) {
}
#pragma mark -- LFStreamingBufferDelegate
- (void)streamingBuffer:(nullable PFStreamingBuffer *)buffer bufferState:(PFLiveBuffferState)state{
if(self.delegate && [self.delegate respondsToSelector:@selector(socketBufferStatus:status:)]){
[self.delegate socketBufferStatus:self status:state];
}
}
- librtmp header
typedef struct PILI_RTMPChunk {
int c_headerSize;
int c_chunkSize;
char *c_chunk;
char c_header[RTMP_MAX_HEADER_SIZE];
} PILI_RTMPChunk;
typedef struct PILI_RTMPPacket {
uint8_t m_headerType; // chunk header type
uint8_t m_packetType; // payload type
uint8_t m_hasAbsTimestamp; // whether the timestamp is absolute
int m_nChannel; // chunk stream id
uint32_t m_nTimeStamp; // timestamp
int32_t m_nInfoField2; // message stream id
uint32_t m_nBodySize; // body size
uint32_t m_nBytesRead; // bytes of the body read so far
PILI_RTMPChunk *m_chunk; // if non-NULL when RTMP_ReadPacket() is called, raw chunk info is wanted; usually NULL
char *m_body; // body pointer
} PILI_RTMPPacket;
typedef struct PILI_RTMPSockBuf {
int sb_socket;
int sb_size; /* number of unprocessed bytes in buffer */
char *sb_start; /* pointer into sb_pBuffer of next byte to process */
char sb_buf[RTMP_BUFFER_CACHE_SIZE]; /* data read from socket */
int sb_timedout;
void *sb_ssl;
} PILI_RTMPSockBuf;
// Reset a packet
void PILI_RTMPPacket_Reset(PILI_RTMPPacket *p);
void PILI_RTMPPacket_Dump(PILI_RTMPPacket *p);
// Allocate the packet body
int PILI_RTMPPacket_Alloc(PILI_RTMPPacket *p, int nSize);
// Free the packet body
void PILI_RTMPPacket_Free(PILI_RTMPPacket *p);
// Check whether a packet is ready; a chunked packet that has not been fully received is not
#define RTMPPacket_IsReady(a) ((a)->m_nBytesRead == (a)->m_nBodySize)
typedef struct PILI_RTMP_LNK {
AVal hostname;
AVal domain;
AVal sockshost;
AVal playpath0; /* parsed from URL */
AVal playpath; /* passed in explicitly */
AVal tcUrl;
AVal swfUrl;
AVal pageUrl;
AVal app;
AVal auth;
AVal flashVer;
AVal subscribepath;
AVal token;
AMFObject extras;
int edepth;
int seekTime;
int stopTime;
#define RTMP_LF_AUTH 0x0001 /* using auth param */
#define RTMP_LF_LIVE 0x0002 /* stream is live */
#define RTMP_LF_SWFV 0x0004 /* do SWF verification */
#define RTMP_LF_PLST 0x0008 /* send playlist before play */
#define RTMP_LF_BUFX 0x0010 /* toggle stream on BufferEmpty msg */
#define RTMP_LF_FTCU 0x0020 /* free tcUrl on close */
int lFlags;
int swfAge;
int protocol;
int timeout; /* connection timeout in seconds */
int send_timeout; /* send data timeout */
unsigned short socksport;
unsigned short port;
#ifdef CRYPTO
#define RTMP_SWF_HASHLEN 32
void *dh; /* for encryption */
void *rc4keyIn;
void *rc4keyOut;
uint32_t SWFSize;
uint8_t SWFHash[RTMP_SWF_HASHLEN];
char SWFVerificationResponse[RTMP_SWF_HASHLEN + 10];
#endif
} PILI_RTMP_LNK;
/* state for read() wrapper */
typedef struct PILI_RTMP_READ {
char *buf;
char *bufpos;
unsigned int buflen;
uint32_t timestamp;
uint8_t dataType;
uint8_t flags;
#define RTMP_READ_HEADER 0x01
#define RTMP_READ_RESUME 0x02
#define RTMP_READ_NO_IGNORE 0x04
#define RTMP_READ_GOTKF 0x08
#define RTMP_READ_GOTFLVK 0x10
#define RTMP_READ_SEEKING 0x20
int8_t status;
#define RTMP_READ_COMPLETE -3
#define RTMP_READ_ERROR -2
#define RTMP_READ_EOF -1
#define RTMP_READ_IGNORE 0
/* if bResume == TRUE */
uint8_t initialFrameType;
uint32_t nResumeTS;
char *metaHeader;
char *initialFrame;
uint32_t nMetaHeaderSize;
uint32_t nInitialFrameSize;
uint32_t nIgnoredFrameCounter;
uint32_t nIgnoredFlvFrameCounter;
} PILI_RTMP_READ;
typedef struct PILI_RTMP_METHOD {
AVal name;
int num;
} PILI_RTMP_METHOD;
typedef void (*PILI_RTMPErrorCallback)(RTMPError *error, void *userData);
typedef struct PILI_CONNECTION_TIME {
uint32_t connect_time;
uint32_t handshake_time;
} PILI_CONNECTION_TIME;
typedef void (*PILI_RTMP_ConnectionTimeCallback)(
PILI_CONNECTION_TIME *conn_time, void *userData);
typedef struct PILI_RTMP {
int m_inChunkSize; // maximum incoming chunk size
int m_outChunkSize; // maximum outgoing chunk size
int m_nBWCheckCounter; // bandwidth-check counter
int m_nBytesIn; // bytes-received counter
int m_nBytesInSent; // bytes-acknowledged counter
int m_nBufferMS; // current buffer length in milliseconds
int m_stream_id; // stream id of the current connection
int m_mediaChannel; // chunk stream id used by the media on this connection
uint32_t m_mediaStamp; // latest media timestamp on this connection
uint32_t m_pauseStamp; // media timestamp when paused
int m_pausing; // whether playback is paused
int m_nServerBW; // server bandwidth
int m_nClientBW; // client bandwidth
uint8_t m_nClientBW2; // client bandwidth limit type
uint8_t m_bPlaying; // currently publishing or playing
uint8_t m_bSendEncoding; // encoding sent when connecting to the server
uint8_t m_bSendCounter; // whether to send bytes-received acknowledgements to the server
int m_numInvokes; // counter of 0x14 remote procedure calls
int m_numCalls; // number of queued 0x14 remote procedure requests
PILI_RTMP_METHOD *m_methodCalls; // queue of remote procedure call requests
PILI_RTMPPacket *m_vecChannelsIn[RTMP_CHANNELS]; // last packet received per chunk stream id
PILI_RTMPPacket *m_vecChannelsOut[RTMP_CHANNELS]; // last packet sent per chunk stream id
int m_channelTimestamp[RTMP_CHANNELS]; // latest media timestamp per chunk stream id
double m_fAudioCodecs; // audio codec code
double m_fVideoCodecs; // video codec code
double m_fEncoding; /* AMF0 or AMF3 */
double m_fDuration; // duration of the current media
int m_msgCounter; // counter of requests sent over HTTP
int m_polling; // position while reading an HTTP message body
int m_resplen; // unread bytes while reading an HTTP message body
int m_unackd; // unacknowledged requests in HTTP mode
AVal m_clientID; // client identity in HTTP mode
PILI_RTMP_READ m_read; // context for RTMP_Read()
PILI_RTMPPacket m_write; // reusable packet used by RTMP_Write()
PILI_RTMPSockBuf m_sb; // context for RTMP_ReadPacket() reads
PILI_RTMP_LNK Link; // RTMP connection context
PILI_RTMPErrorCallback m_errorCallback; // called after the rtmp link drops or fails
PILI_RTMP_ConnectionTimeCallback m_connCallback; // connection-time callback
RTMPError *m_error; //
void *m_userData;
int m_is_closing;
int m_tcp_nodelay;
uint32_t ip;
} PILI_RTMP;
// Parse the stream URL
int PILI_RTMP_ParseURL(const char *url, int *protocol, AVal *host,
unsigned int *port, AVal *playpath, AVal *app);
int PILI_RTMP_ParseURL2(const char *url, int *protocol, AVal *host,
unsigned int *port, AVal *playpath, AVal *app, AVal *domain);
void PILI_RTMP_ParsePlaypath(AVal *in, AVal *out);
// Before connecting: set the media buffer duration the server sends the client
void PILI_RTMP_SetBufferMS(PILI_RTMP *r, int size);
// After connecting: update the media buffer duration the server sends the client
void PILI_RTMP_UpdateBufferMS(PILI_RTMP *r, RTMPError *error);
// Update the corresponding option in the RTMP context
int PILI_RTMP_SetOpt(PILI_RTMP *r, const AVal *opt, AVal *arg,
RTMPError *error);
// Set the stream URL
int PILI_RTMP_SetupURL(PILI_RTMP *r, const char *url, RTMPError *error);
// Set the play URL and options on the RTMP context; fields you do not care about may be NULL
void PILI_RTMP_SetupStream(PILI_RTMP *r, int protocol, AVal *hostname,
unsigned int port, AVal *sockshost, AVal *playpath,
AVal *tcUrl, AVal *swfUrl, AVal *pageUrl, AVal *app,
AVal *auth, AVal *swfSHA256Hash, uint32_t swfSize,
AVal *flashVer, AVal *subscribepath, int dStart,
int dStop, int bLiveStream, long int timeout);
// Client-side connect and handshake
int PILI_RTMP_Connect(PILI_RTMP *r, PILI_RTMPPacket *cp, RTMPError *error);
struct sockaddr;
int PILI_RTMP_Connect0(PILI_RTMP *r, struct addrinfo *ai, unsigned short port,
RTMPError *error);
int PILI_RTMP_Connect1(PILI_RTMP *r, PILI_RTMPPacket *cp, RTMPError *error);
// Server-side handshake
int PILI_RTMP_Serve(PILI_RTMP *r, RTMPError *error);
// Receive one packet
int PILI_RTMP_ReadPacket(PILI_RTMP *r, PILI_RTMPPacket *packet);
// Send one packet; queue=1 means 0x14 packets are queued to await a response
int PILI_RTMP_SendPacket(PILI_RTMP *r, PILI_RTMPPacket *packet, int queue,
RTMPError *error);
// Send a chunk directly
int PILI_RTMP_SendChunk(PILI_RTMP *r, PILI_RTMPChunk *chunk, RTMPError *error);
// Check whether the network is connected
int PILI_RTMP_IsConnected(PILI_RTMP *r);
// Return the socket
int PILI_RTMP_Socket(PILI_RTMP *r);
// Check whether the connection has timed out
int PILI_RTMP_IsTimedout(PILI_RTMP *r);
// Get the duration of the current media
double PILI_RTMP_GetDuration(PILI_RTMP *r);
// Toggle between pause and play
int PILI_RTMP_ToggleStream(PILI_RTMP *r, RTMPError *error);
// Connect the stream, specifying the playback start position
int PILI_RTMP_ConnectStream(PILI_RTMP *r, int seekTime, RTMPError *error);
// Recreate the stream
int PILI_RTMP_ReconnectStream(PILI_RTMP *r, int seekTime, RTMPError *error);
// Delete the current stream
void PILI_RTMP_DeleteStream(PILI_RTMP *r, RTMPError *error);
// Get the next media packet
int PILI_RTMP_GetNextMediaPacket(PILI_RTMP *r, PILI_RTMPPacket *packet);
// Handle the client's packet exchange, i.e. the packet-dispatch logic
int PILI_RTMP_ClientPacket(PILI_RTMP *r, PILI_RTMPPacket *packet);
// Initialize the RTMP context with default values
void PILI_RTMP_Init(PILI_RTMP *r);
// Close the RTMP context
void PILI_RTMP_Close(PILI_RTMP *r, RTMPError *error);
// Allocate an RTMP context
PILI_RTMP *PILI_RTMP_Alloc(void);
// Free the RTMP context
void PILI_RTMP_Free(PILI_RTMP *r);
// Enable the client's RTMP write switch, used for publishing
void PILI_RTMP_EnableWrite(PILI_RTMP *r);
// Return the librtmp version
int PILI_RTMP_LibVersion(void);
// Interrupt RTMP processing
void PILI_RTMP_UserInterrupt(void); /* user typed Ctrl-C */
// Send a 0x04 user-control message
int PILI_RTMP_SendCtrl(PILI_RTMP *r, short nType, unsigned int nObject,
unsigned int nTime, RTMPError *error);
/* caller probably doesn't know current timestamp, should
 * just use RTMP_Pause instead
 */
// Send a 0x14 remote call to control pause
int PILI_RTMP_SendPause(PILI_RTMP *r, int DoPause, int dTime, RTMPError *error);
int PILI_RTMP_Pause(PILI_RTMP *r, int DoPause, RTMPError *error);
// Recursively search an object for the named property
int PILI_RTMP_FindFirstMatchingProperty(AMFObject *obj, const AVal *name,
AMFObjectProperty *p);
// Low-level socket read, send, and close operations
int PILI_RTMPSockBuf_Fill(PILI_RTMPSockBuf *sb);
int PILI_RTMPSockBuf_Send(PILI_RTMPSockBuf *sb, const char *buf, int len);
int PILI_RTMPSockBuf_Close(PILI_RTMPSockBuf *sb);
// Send createStream
int PILI_RTMP_SendCreateStream(PILI_RTMP *r, RTMPError *error);
// Send a media seek
int PILI_RTMP_SendSeek(PILI_RTMP *r, int dTime, RTMPError *error);
// Send the server acknowledgement-window-size setting
int PILI_RTMP_SendServerBW(PILI_RTMP *r, RTMPError *error);
// Send the server output-bandwidth setting
int PILI_RTMP_SendClientBW(PILI_RTMP *r, RTMPError *error);
// Drop a request from the 0x14 remote-call queue
void PILI_RTMP_DropRequest(PILI_RTMP *r, int i, int freeit);
// Read FLV-format data
int PILI_RTMP_Read(PILI_RTMP *r, char *buf, int size);
// Write FLV-format data
int PILI_RTMP_Write(PILI_RTMP *r, const char *buf, int size, RTMPError *error);
/* hashswf.c */
int PILI_RTMP_HashSWF(const char *url, unsigned int *size, unsigned char *hash,
int age);
Server setup
There are plenty of articles online about setting up an RTMP server, so I won't repeat them here. The one connection failure I ran into was fixed by turning off the firewall.
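For completeness, a minimal nginx + nginx-rtmp-module configuration of the kind those articles describe; the application name live and port 1935 are conventional choices, not something this project requires:
rtmp {
    server {
        listen 1935;        # default RTMP port
        chunk_size 4096;
        application live {  # publish to rtmp://<host>/live/<stream-key>
            live on;
            record off;
        }
    }
}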
Playback side
For playback I used bilibili's ijkPlayer directly. There are also plenty of articles on compiling it; I followed the steps on GitHub and it built on the first try. Besides the libraries GitHub lists, I additionally had to add libc++.tbd. The playback code can be found here.
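For reference, a minimal sketch of pulling the stream with ijkPlayer's IJKFFMoviePlayerController; the URL is a placeholder:
#import <IJKMediaFramework/IJKMediaFramework.h>
// sketch: basic ijkPlayer setup for an RTMP pull URL (placeholder address)
IJKFFOptions *options = [IJKFFOptions optionsByDefault];
IJKFFMoviePlayerController *player =
    [[IJKFFMoviePlayerController alloc] initWithContentURL:[NSURL URLWithString:@"rtmp://your.server/live/stream"]
                                               withOptions:options];
player.view.frame = self.view.bounds;
player.scalingMode = IJKMPMovieScalingModeAspectFit;
player.shouldAutoplay = YES;
[self.view addSubview:player.view];
[player prepareToPlay];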
Capture and streaming project repo
ijkPlayer playback project repo
Reference:
使用librtmp庫進行推流與拉流 (publishing and pulling streams with the librtmp library)