【Contents】
- Building an Inke-Style Live Streaming App in Practice -【Principles】
- Building an Inke-Style Live Streaming App in Practice -【Capture】
- Building an Inke-Style Live Streaming App in Practice -【Server Setup + Streaming】
- Building an Inke-Style Live Streaming App in Practice -【Playback】
【Capture: Basic Principles】
Capture: video frames from the hardware (camera).
Pushing the stream: sending the captured audio and video data to a streaming media server over a streaming protocol.
Work before pushing: capture, processing, encoding/compression.
Work while pushing: packaging, upload.
【Video Capture】
- Option 1: use the wrapper library LFLiveKit (recommended)
- Option 2: use the system framework AVFoundation
Below I'll show the code for both approaches.
**LFLiveKit already implements background recording, beauty filters, H.264 and AAC hardware encoding, dynamic bitrate adjustment, RTMP transport, and more on top of AVFoundation, so in real development you can simply use it directly.** There are also:
LiveVideoCoreSDK: implements beauty-filter live streaming and filters; just fill in your RTMP server address and you can start pushing the stream.
PLCameraStreamingKit: another solid RTMP streaming SDK.
LFLiveKit is recommended because it already covers capture, beauty filters, encoding, and streaming. Still, to understand the full capture-to-push pipeline, it's worth walking through the Option 2 code at your own pace; it goes through each stage in detail.
Option 1: Using LFLiveKit
1. Add two Buttons and a Label to the xib (the Label is mainly for monitoring the connection state).
2. Create the CaputuereLiveViewController class; the explanations are in the comments.
// CaputuereLiveViewController.m
// ZKKLiveAPP
//
// Created by Kevin on 16/11/12.
// Copyright © 2016 zhangkk. All rights reserved.
//
#import "CaputuereLiveViewController.h"
#import <LFLiveKit/LFLiveKit.h>
@interface CaputuereLiveViewController ()<LFLiveSessionDelegate>{
LFLiveSession *_session;
}
// Session object that drives capture, encoding, and streaming
@property(nonatomic,strong)LFLiveSession *session;
// Streaming status label (used in the next article, on pushing the stream)
@property (weak, nonatomic) IBOutlet UILabel *linkStatusLb;
// Beauty filter
@property (weak, nonatomic) IBOutlet UIButton *beautyBtn;
- (IBAction)beautyBtn:(UIButton *)sender;
// Switch camera
@property (weak, nonatomic) IBOutlet UIButton *changCamreBtn;
- (IBAction)changCamreBtn:(UIButton *)sender;
- (IBAction)backBtn:(UIButton *)sender;
@end
@implementation CaputuereLiveViewController
-(void)viewWillAppear:(BOOL)animated{
[super viewWillAppear:animated];
[UIApplication sharedApplication].statusBarHidden = YES;
self.tabBarController.tabBar.hidden = YES;
self.hidesBottomBarWhenPushed = YES;
[self requestAccessForVideo];// request camera permission
[self requestAccessForAudio];// request microphone permission
// start the live session
[self startLive];
}
- (void)viewDidLoad {
[super viewDidLoad];
self.view.backgroundColor= [UIColor clearColor];
}
-(void)viewWillDisappear:(BOOL)animated{
[super viewWillDisappear:animated];
[self stopLive];
}
#pragma mark -- Public Method
-(void)requestAccessForVideo{
__weak typeof(self) _self = self;
AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
switch (status) {
case AVAuthorizationStatusNotDetermined:
{
// Authorization not determined yet: show the permission prompt
[AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
if(granted){
dispatch_async(dispatch_get_main_queue(), ^{
[_self.session setRunning:YES];
});
}
}];
break;
}
case AVAuthorizationStatusAuthorized:
{
dispatch_async(dispatch_get_main_queue(), ^{
[_self.session setRunning:YES];
});
break;
}
case AVAuthorizationStatusDenied:
case AVAuthorizationStatusRestricted:
// Access denied or restricted
break;
default:
break;
}
}
-(void)requestAccessForAudio{
AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeAudio];
switch (status) {
case AVAuthorizationStatusNotDetermined:{
[AVCaptureDevice requestAccessForMediaType:AVMediaTypeAudio completionHandler:^(BOOL granted) {
}];
}
break;
case AVAuthorizationStatusAuthorized:
break;
case AVAuthorizationStatusRestricted:
case AVAuthorizationStatusDenied:
break;
default:
break;
}
}
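// Note: on iOS 10 and later, Info.plist must declare NSCameraUsageDescription
// and NSMicrophoneUsageDescription, or the system terminates the app the first
// time it requests camera/microphone access.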
#pragma mark -- LFStreamingSessionDelegate
/**
 Connection state changed
 */
-(void)liveSession:(LFLiveSession *)session liveStateDidChange:(LFLiveState)state{
switch (state) {
case LFLiveReady:
_linkStatusLb.text = @"Not connected";
break;
case LFLivePending:
_linkStatusLb.text = @"Connecting...";
break;
case LFLiveStart:
_linkStatusLb.text = @"Connected";
break;
case LFLiveStop:
_linkStatusLb.text = @"Disconnected";
break;
case LFLiveError:
_linkStatusLb.text = @"Connection error";
break;
default:
break;
}
}
/** Debug info callback */
-(void)liveSession:(LFLiveSession *)session debugInfo:(LFLiveDebug *)debugInfo{
NSLog(@"debugInfo:%@",debugInfo);
}
/** Socket error-code callback */
- (void)liveSession:(nullable LFLiveSession *)session errorCode:(LFLiveSocketErrorCode)errorCode {
NSLog(@"errorCode: %ld", (long)errorCode);
}
/**
 Start / stop the live session
 */
-(void)startLive{
LFLiveStreamInfo *stream = [LFLiveStreamInfo new];
// Used in the next article, when we actually push the stream:
/*stream.url = @"rtmp://192.168.0.2:1990/liveApp/room";
[self.session startLive:stream];*/
}
-(void)stopLive{
[self.session stopLive];
}
- (LFLiveSession*)session {
if (!_session) {
_session = [[LFLiveSession alloc] initWithAudioConfiguration:[LFLiveAudioConfiguration defaultConfiguration] videoConfiguration:[LFLiveVideoConfiguration defaultConfiguration]];
_session.preView = self.view;// render the camera preview into this view
_session.delegate = self;
}
return _session;
}
- (void)didReceiveMemoryWarning {
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
/**
 Actions: beauty filter / switch front-back camera
 @param sender button
 */
- (IBAction)beautyBtn:(UIButton *)sender {
sender.selected = !sender.selected;
self.session.beautyFace = !self.session.beautyFace;
}
- (IBAction)changCamreBtn:(UIButton *)sender {
AVCaptureDevicePosition position = self.session.captureDevicePosition;
// Toggle between the front and back cameras
self.session.captureDevicePosition = (position == AVCaptureDevicePositionBack)?AVCaptureDevicePositionFront:AVCaptureDevicePositionBack;
}
- (IBAction)backBtn:(UIButton *)sender {
NSLog(@"返回");
// self.view.window.rootViewController = self.tabBarController;
[self.tabBarController setSelectedIndex:0];
self.tabBarController.tabBar.hidden = NO;
}
@end
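By the way, if you want more control than defaultConfiguration gives you, LFLiveKit also exposes quality presets. A minimal sketch (the preset names below match LFLiveKit's public headers at the time of writing; verify the exact enum values against the version you install):
// Alternative session setup with explicit audio/video quality presets
LFLiveAudioConfiguration *audioConfig = [LFLiveAudioConfiguration defaultConfigurationForQuality:LFLiveAudioQuality_High];
LFLiveVideoConfiguration *videoConfig = [LFLiveVideoConfiguration defaultConfigurationForQuality:LFLiveVideoQuality_Medium2];
LFLiveSession *session = [[LFLiveSession alloc] initWithAudioConfiguration:audioConfig videoConfiguration:videoConfig];
session.preView = self.view;// render the camera preview into this view
session.delegate = self;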
Option 2: Capturing Video with the System AVFoundation Framework
1. Capturing video frames from the hardware (camera)
#import "CaputureViewController.h"
#import <AVFoundation/AVFoundation.h>
#import "GPUImageBeautifyFilter.h"
@interface CaputureViewController ()<AVCaptureVideoDataOutputSampleBufferDelegate,AVCaptureAudioDataOutputSampleBufferDelegate>
/**采集視頻*/
//切換屏幕按鈕
@property (weak, nonatomic) IBOutlet UIButton *changScreenBtn;
//采集視頻總控制
@property(nonatomic,strong)AVCaptureSession *captureSession;
//視頻采集輸入數(shù)據(jù)源
@property(nonatomic,strong)AVCaptureDeviceInput *currentVideoDeviceInput;
//將攝像頭采集數(shù)據(jù)源顯示在屏幕上
@property(nonatomic,weak)AVCaptureVideoPreviewLayer *previedLayer;
//采集的截取數(shù)據(jù)流 一般用與美顏等處理
@property(nonatomic,weak)AVCaptureConnection *videoConnection;
- (IBAction)changScreenBtn:(UIButton *)sender;
/*開(kāi)啟美顏*/
@property (weak, nonatomic) IBOutlet UISwitch *openBeautySwitch;
- (IBAction)switch:(UISwitch *)sender;
//@property(nonatomic,)BOOL isOpenBeauty;
//@property(nonatomic,strong)<#type#> *<#Name#>;
@end
@implementation CaputureViewController
-(void)viewWillAppear:(BOOL)animated{
[super viewWillAppear:animated];
if (_captureSession) {
[_captureSession startRunning];
}
}
- (void)viewDidLoad {
[super viewDidLoad];
[self.view addSubview:self.focusCursorImageView];
self.view.backgroundColor = [UIColor whiteColor];
/* 1. Capture video with AVFoundation */
[self setupCaputureVideo];
/* 2. GPUImage beauty view (see section 2 below) */
}
- (void)viewWillDisappear:(BOOL)animated{
[super viewWillDisappear:animated];
if (_captureSession) {
[_captureSession stopRunning];
}
}
/**
 Set up audio/video capture
 */
-(void)setupCaputureVideo{
// Create the capture session
_captureSession = [[AVCaptureSession alloc]init];
// Get the capture devices: camera (front) and microphone
// AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDevice *videoDevice = [self getVideoDevice:AVCaptureDevicePositionFront];
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
// Create device-input objects for the video and audio devices
AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:nil];
AVCaptureDeviceInput * audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:nil];
_currentVideoDeviceInput = videoDeviceInput;
if ([_captureSession canAddInput:_currentVideoDeviceInput]) {
[_captureSession addInput:_currentVideoDeviceInput];
}
if ([_captureSession canAddInput:audioDeviceInput]) {
[_captureSession addInput:audioDeviceInput];
}
// Create the video and audio data outputs
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc]init];
AVCaptureAudioDataOutput *audioOutput = [[AVCaptureAudioDataOutput alloc]init];
// Serial queues for the sample-buffer callbacks
dispatch_queue_t videoQueue = dispatch_queue_create("VideoQueue",DISPATCH_QUEUE_SERIAL);
dispatch_queue_t audioQueue = dispatch_queue_create("audioQueue", DISPATCH_QUEUE_SERIAL);
[videoOutput setSampleBufferDelegate:self queue:videoQueue];
[audioOutput setSampleBufferDelegate:self queue:audioQueue];
videoOutput.videoSettings = @{(NSString*)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)};
// Add the outputs to the session
if ([_captureSession canAddOutput:videoOutput]) {
[_captureSession addOutput: videoOutput];
}
if ([_captureSession canAddOutput:audioOutput]) {
[_captureSession addOutput:audioOutput];
}
// Grab the video connection, used later to tell video from audio samples
_videoConnection = [videoOutput connectionWithMediaType:AVMediaTypeVideo];
// Attach a preview layer so the camera feed is displayed
AVCaptureVideoPreviewLayer *previedLayer = [AVCaptureVideoPreviewLayer layerWithSession:_captureSession];
previedLayer.frame = [UIScreen mainScreen].bounds;
[self.view.layer insertSublayer:previedLayer atIndex:0];
[self.view.layer insertSublayer:_changScreenBtn.layer atIndex:1];
_previedLayer = previedLayer;
[_captureSession startRunning];
}
// Switch between the front and back cameras
- (IBAction)changScreenBtn:(UIButton *)sender {
// Current camera position
AVCaptureDevicePosition curPosition = _currentVideoDeviceInput.device.position;
// Position to switch to
AVCaptureDevicePosition togglePosition = curPosition == AVCaptureDevicePositionFront?AVCaptureDevicePositionBack:AVCaptureDevicePositionFront;
// Device at the new position
AVCaptureDevice *toggleDevice = [self getVideoDevice:togglePosition];
// Swap the inputs
AVCaptureDeviceInput *toggleDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:toggleDevice error:nil];
[_captureSession removeInput:_currentVideoDeviceInput];
[_captureSession addInput:toggleDeviceInput];
_currentVideoDeviceInput = toggleDeviceInput;
}
-(AVCaptureDevice *)getVideoDevice:(AVCaptureDevicePosition)position {
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
for( AVCaptureDevice *device in devices) {
if (device.position == position) {
return device;
}
}
return nil;
}
-(UIImageView *)focusCursorImageView{
if (!_focusCursorImageView) {
_focusCursorImageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"focus"]];
}
return _focusCursorImageView;
}
#pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate
// Intercept the captured sample buffers (video and audio both land here)
-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection{
if (_videoConnection == connection) {
NSLog(@"video sample captured");
/* beauty processing goes here */
}else{
NSLog(@"audio sample captured");
}
}
@end
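One caveat: devicesWithMediaType: was deprecated in iOS 10. If you target iOS 10 or later, the lookup can be rewritten with the discovery-session API; a minimal sketch with the same behavior as getVideoDevice: above:
// iOS 10+ replacement for -getVideoDevice: using AVCaptureDeviceDiscoverySession
- (AVCaptureDevice *)cameraAtPosition:(AVCaptureDevicePosition)position {
AVCaptureDeviceDiscoverySession *discovery = [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera] mediaType:AVMediaTypeVideo position:position];
return discovery.devices.firstObject;
}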
The above roughly covers capturing the raw data; some details (frame size, orientation) are not handled here. In a real live-streaming app, video and audio are usually processed separately; the key point is that one delegate method.
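For instance, if you want to pull an image out of a video sample buffer (the references below link a full CMSampleBufferRef-to-UIImage write-up), a minimal sketch, assuming CoreImage is available:
// Inside the video branch of the delegate callback:
CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:ciImage fromRect:ciImage.extent];
UIImage *image = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);
// image is now a UIImage copy of the frame (expensive; avoid doing this per frame in production)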
2. Processing with GPUImage
Before encoding to H.264, you will almost always apply some beauty filtering first; otherwise the broadcast looks a little too true to life. Here we take skin smoothing and whitening as a simple example. (This follows 琨君's real-time beauty filter based on GPUImage.)
Using the GPUImageBeautifyFilter class from BeautifyFaceDemo, you can process a still image directly:
GPUImageBeautifyFilter *filter = [[GPUImageBeautifyFilter alloc] init];
UIImage *image = [UIImage imageNamed:@"testMan"];
UIImage *resultImage = [filter imageByFilteringImage:image];
self.backgroundView.image = resultImage;
But how do you apply the same beauty processing to video? How is it converted? Normally we just write:
GPUImageBeautifyFilter *beautifyFilter = [[GPUImageBeautifyFilter alloc] init];
[self.videoCamera addTarget:beautifyFilter];
[beautifyFilter addTarget:self.gpuImageView];
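For context, a minimal self-contained sketch of that response chain, assuming a portrait 720p front camera (videoCamera and gpuImageView are local placeholders here):
#import <GPUImage/GPUImage.h>
#import "GPUImageBeautifyFilter.h"
// Build the chain: camera (source) -> beautify filter -> on-screen view (sink)
GPUImageVideoCamera *videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset1280x720 cameraPosition:AVCaptureDevicePositionFront];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
GPUImageView *gpuImageView = [[GPUImageView alloc] initWithFrame:self.view.bounds];
[self.view insertSubview:gpuImageView atIndex:0];
GPUImageBeautifyFilter *beautifyFilter = [[GPUImageBeautifyFilter alloc] init];
[videoCamera addTarget:beautifyFilter];
[beautifyFilter addTarget:gpuImageView];
[videoCamera startCameraCapture];// frames now flow through the chain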
This uses GPUImageVideoCamera; for an overview see "GPUImage Detailed Analysis (3): Real-time Beauty Filter" in the references:
GPUImageVideoCamera: a subclass of GPUImageOutput; supplies image data from the camera and usually sits at the head of the response chain.
GPUImageView: the end point of the response chain, usually used to display the GPUImage output.
GPUImageFilter: receives a source image, renders a new image with custom vertex and fragment shaders, and notifies the next object in the chain when drawing completes.
GPUImageFilterGroup: a collection of GPUImageFilters.
GPUImageBeautifyFilter:
@interface GPUImageBeautifyFilter : GPUImageFilterGroup {
GPUImageBilateralFilter *bilateralFilter;
GPUImageCannyEdgeDetectionFilter *cannyEdgeFilter;
GPUImageCombinationFilter *combinationFilter;
GPUImageHSBFilter *hsbFilter;
}
A simple way to understand the beauty-filter pipeline.
GPUImage is remarkably powerful, and what's shown here is only a small part of it. There is still a lot in the filter internals I don't yet understand, which calls for digging into the source, so I won't go deeper here. Once the sampleBuffer has been through beauty processing, the natural next step is encoding.
三朴下、視頻、音頻壓縮編碼
而編碼是用 硬編碼呢 還是軟編碼呢苦蒿? 相同碼率殴胧,軟編圖像質(zhì)量更清晰,但是耗電更高,而且會(huì)導(dǎo)致CPU過(guò)熱燙到攝像頭团滥。不過(guò)硬編碼會(huì)涉及到其他平臺(tái)的解碼竿屹,有很多坑。綜合來(lái)說(shuō)灸姊,iOS 端硬件兼容性較好拱燃,iOS 8.0占有率也已經(jīng)很高了,可以直接采用硬編力惯。
硬編碼:下面幾個(gè)DEMO 可以對(duì)比下碗誉,當(dāng)然看 LFLiveKit 更直接。
VideoToolboxPlus
iOSHardwareDecoder
VideoToolboxDemo
iOS-h264Hw-Toolbox
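As a taste of what those demos do, here is a minimal sketch of setting up a VideoToolbox H.264 compression session (error handling omitted; didCompressFrame is a callback you implement to collect the encoded NALUs):
#import <VideoToolbox/VideoToolbox.h>
static VTCompressionSessionRef compressionSession;
// Called by VideoToolbox with each encoded frame (H.264 NALUs inside the sample buffer)
static void didCompressFrame(void *refcon, void *frameRefcon, OSStatus status, VTEncodeInfoFlags flags, CMSampleBufferRef sampleBuffer) {
if (status != noErr || sampleBuffer == NULL) return;
// Extract SPS/PPS from the first keyframe, then package the NALUs (e.g. into FLV tags)
}
static void setupEncoder(int32_t width, int32_t height) {
VTCompressionSessionCreate(kCFAllocatorDefault, width, height, kCMVideoCodecType_H264, NULL, NULL, NULL, didCompressFrame, NULL, &compressionSession);
// Real-time encoding with no frame reordering (no B-frames), for low-latency live streaming
VTSessionSetProperty(compressionSession, kVTCompressionPropertyKey_RealTime, kCFBooleanTrue);
VTSessionSetProperty(compressionSession, kVTCompressionPropertyKey_AllowFrameReordering, kCFBooleanFalse);
VTCompressionSessionPrepareToEncodeFrames(compressionSession);
}
// Feed each captured frame (from the AVFoundation callback) to the encoder:
// CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
// CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
// VTCompressionSessionEncodeFrame(compressionSession, pixelBuffer, pts, kCMTimeInvalid, NULL, NULL, NULL);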
四父晶、推流
封裝數(shù)據(jù)成 FLV哮缺,通過(guò) RTMP 協(xié)議打包上傳,從主播端到服務(wù)端即基本完成推流甲喝。
4-1尝苇、封裝數(shù)據(jù)通常是封裝成 FLV
FLV流媒體格式是一種新的視頻格式,全稱為FlashVideo埠胖。由于它形成的文件極小糠溜、加載速度極快,使得網(wǎng)絡(luò)觀看視頻文件成為可能直撤,它的出現(xiàn)有效地解決了視頻文件導(dǎo)入Flash后非竿,使導(dǎo)出的SWF文件體積龐大,不能在網(wǎng)絡(luò)上很好的使用等缺點(diǎn)谊惭。
(封包 FLV):一般FLV 文件結(jié)構(gòu)里是這樣存放的:
[[Flv Header]
[Metainfo Tag]
[Video Tag]
[Audio Tag]
[Video Tag]
[Audio Tag]
[Other Tag]…]
AudioTags and VideoTags can appear in any order; there is no strict rule. The Flv Header marks the file type with the string "FLV" and flags whether the file contains audio and/or video; the next few bytes give the size of the packet that follows. The Metainfo tag describes the stream's parameters, such as video codec, resolution, and sample rate; a local file (as opposed to a live stream) also carries offsets and timestamps to support seeking and similar operations. VideoTags hold the video data: for H.264, the first NALUs sent must be the SPS and PPS, which act as the H.264 stream header; a player must find them before it can decode, and the frames that follow are I- or P-frames. AudioTags hold the audio data: for AAC, we only need to prepend an ADTS header to each frame after hardware encoding.
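To make the tag layout concrete, a minimal sketch of appending one FLV tag to a buffer (an 11-byte header with big-endian fields, per the FLV spec; appendTag is just an illustrative helper name):
// Append one FLV tag (header + body) to a growing buffer.
// type: 8 = audio, 9 = video, 18 = script data (metainfo)
static void appendTag(NSMutableData *flv, uint8_t type, NSData *body, uint32_t timestampMs) {
uint32_t size = (uint32_t)body.length;
uint8_t header[11] = {
type,
(uint8_t)(size >> 16), (uint8_t)(size >> 8), (uint8_t)size, // DataSize, 24 bit
(uint8_t)(timestampMs >> 16), (uint8_t)(timestampMs >> 8), (uint8_t)timestampMs, (uint8_t)(timestampMs >> 24), // Timestamp + extended byte
0, 0, 0 // StreamID, always 0
};
[flv appendData:[NSData dataWithBytes:header length:sizeof(header)]];
[flv appendData:body];
// Each tag is followed by a 4-byte PreviousTagSize = 11 + body length
uint32_t prevSize = CFSwapInt32HostToBig(11 + size);
[flv appendData:[NSData dataWithBytes:&prevSize length:4]];
}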
Usage in iOS: take a close look at the LFStreamRTMPSocket class in LFLiveKit.
All in all, this is again a rough pass, standing on the shoulders of many others, but it covers the basic flow of pushing a stream. Without real project experience I have surely glossed over many details and pitfalls to fill in later; as before, the goal is to build up my own background knowledge. The topics that could be expanded and explored here are vast; LFLiveKit and GPUImage alone are just the tip of the iceberg.
References:
LiveVideoCoreSDK
LFLiveKit
GPUImage
LMLiveStreaming
PLCameraStreamingKit
A Technical Overview of iOS Live Streaming Demos
iOS Video Development Experience
Camera Capture on iOS
Converting Between CMSampleBufferRef and UIImage
GPUImage Detailed Analysis (3): Real-time Beauty Filter
Notes on H.264 Hardware Encoding/Decoding on iOS 8
Encoding the iOS Camera's Live Video Stream to an H.264 File with FFmpeg + x264
Hardware-Encoding H.264 with VideoToolbox
Using the Built-in iOS AAC Encoder
How to Build a Complete Live Video Streaming System?
Optimizing Cumulative Latency in Live Streaming
Using VLC as a Streaming Media Server (Live Mode)
GitHub code:
Objective-C version: https://github.com/one-tea/ZKKLiveDemo
Swift version: https://github.com/one-tea/ZKKLiveAPP_Swift3.0