Overview
In live-streaming apps, video capture is usually built on the AVFoundation framework, because it lets us customize the capture parameters; it also supports camera operations such as switching between the front and back cameras, taking photos, and turning the torch on and off. Most importantly, it gives us access to the raw video frames for encoding and further processing. This article covers:
- The key classes involved in video capture
- The steps of setting up a capture session
- How to change capture parameters, e.g. resolution, frame rate, pinch-to-zoom on the preview layer, and exposure
- Camera operations in detail, e.g. taking photos, switching cameras, and toggling the torch
Code:
- github
- Feel free to fork & star
Key classes for video capture
AVCaptureDevice
Represents a hardware device: through this class we access the phone's cameras, microphones, and other capture hardware. Whenever we change a device property (e.g. flash mode or focus mode), we must call lockForConfiguration
to lock the device before the change, and unlockForConfiguration
to unlock it afterwards.
AVCaptureDeviceInput
The input-management object. We create an AVCaptureDeviceInput from an AVCaptureDevice and add it to an AVCaptureSession, which manages it. It represents an input device and configures the device's ports; typical input devices are microphones and cameras.
AVCaptureOutput
Represents the output data, which can be still images (AVCaptureStillImageOutput) or video (AVCaptureMovieFileOutput).
AVCaptureSession
The media capture session, responsible for routing the captured audio and video data to the outputs. A single AVCaptureSession can have multiple inputs and outputs. It is the bridge between AVCaptureInput and AVCaptureOutput, coordinating the flow of data from inputs to outputs. The startRunning and stopRunning methods begin and end the session.
If you need to change a running session's configuration (e.g. when switching cameras), wrap the changes between beginConfiguration and commitConfiguration.
AVCaptureConnection
AVCaptureConnection represents a connection between an AVCaptureInputPort or ports, and an AVCaptureOutput or AVCaptureVideoPreviewLayer present in an AVCaptureSession. In other words, it is the link either between an input port and an output, or between the preview layer and the session.
AVCaptureVideoPreviewLayer
The preview layer. How do the photos and video actually show up on screen? By adding this object as a sublayer of a UIView's layer.
Steps of video capture
The following code captures video at 30 fps with a resolution of 1920x1080.
#import "MiVideoCollectVC.h"
#import <AVFoundation/AVFoundation.h>
@interface MiVideoCollectVC ()<AVCaptureVideoDataOutputSampleBufferDelegate>
@property (nonatomic,strong) AVCaptureVideoDataOutput *video_output;
@property (nonatomic,strong) AVCaptureSession *m_session;
@property (weak, nonatomic) IBOutlet UIView *m_displayView;
@end
@implementation MiVideoCollectVC
- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    [self startCaptureSession];
}
- (void)viewWillAppear:(BOOL)animated
{
    [super viewWillAppear:animated];
    [self startPreview];
}
- (IBAction)onpressedBtnDismiss:(id)sender {
    [self dismissViewControllerAnimated:YES completion:^{
        [self stopPreview];
    }];
}
- (void)startCaptureSession
{
    NSError *error = nil;
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    if ([session canSetSessionPreset:AVCaptureSessionPreset1920x1080]) {
        session.sessionPreset = AVCaptureSessionPreset1920x1080;
    } else {
        session.sessionPreset = AVCaptureSessionPresetHigh;
    }
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (error || !input) {
        NSLog(@"get input device error...");
        return;
    }
    [session addInput:input];
    _video_output = [[AVCaptureVideoDataOutput alloc] init];
    [session addOutput:_video_output];
    // Specify the pixel format
    _video_output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange]
                                                              forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    _video_output.alwaysDiscardsLateVideoFrames = NO;
    dispatch_queue_t video_queue = dispatch_queue_create("MIVideoQueue", NULL);
    [_video_output setSampleBufferDelegate:self queue:video_queue];
    // Request 30 fps by pinning both min and max frame duration to 1/30 s
    CMTime frameDuration = CMTimeMake(1, 30);
    BOOL frameRateSupported = NO;
    for (AVFrameRateRange *range in [device.activeFormat videoSupportedFrameRateRanges]) {
        if (CMTIME_COMPARE_INLINE(frameDuration, >=, range.minFrameDuration) &&
            CMTIME_COMPARE_INLINE(frameDuration, <=, range.maxFrameDuration)) {
            frameRateSupported = YES;
        }
    }
    if (frameRateSupported && [device lockForConfiguration:&error]) {
        [device setActiveVideoMaxFrameDuration:frameDuration];
        [device setActiveVideoMinFrameDuration:frameDuration];
        [device unlockForConfiguration];
    }
    [self adjustVideoStabilization];
    _m_session = session;
    CALayer *previewViewLayer = [self.m_displayView layer];
    previewViewLayer.backgroundColor = [[UIColor blackColor] CGColor];
    AVCaptureVideoPreviewLayer *newPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_m_session];
    [newPreviewLayer setFrame:[UIApplication sharedApplication].keyWindow.bounds];
    [newPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    [previewViewLayer insertSublayer:newPreviewLayer atIndex:0];
}
- (void)adjustVideoStabilization
{
    NSArray *devices = [AVCaptureDevice devices];
    for (AVCaptureDevice *device in devices) {
        if ([device hasMediaType:AVMediaTypeVideo]) {
            if ([device.activeFormat isVideoStabilizationModeSupported:AVCaptureVideoStabilizationModeAuto]) {
                for (AVCaptureConnection *connection in _video_output.connections) {
                    for (AVCaptureInputPort *port in [connection inputPorts]) {
                        if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                            if (connection.supportsVideoStabilization) {
                                connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeStandard;
                                NSLog(@"now videoStabilizationMode = %ld", (long)connection.activeVideoStabilizationMode);
                            } else {
                                NSLog(@"connection does not support video stabilization");
                            }
                        }
                    }
                }
            } else {
                NSLog(@"device does not support video stabilization");
            }
        }
    }
}
- (void)startPreview
{
    if (![_m_session isRunning]) {
        [_m_session startRunning];
    }
}
- (void)stopPreview
{
    if ([_m_session isRunning]) {
        [_m_session stopRunning];
    }
}
#pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate
- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    NSLog(@"%s", __func__);
}
// This delegate method fires when a frame is dropped
- (void)captureOutput:(AVCaptureOutput *)output didDropSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    NSLog(@"MediaIOS: dropped frame...");
}
@end
The capture steps can be summarized as:
- Create an AVCaptureSession, create the input and output devices, and add them to the session.
- Set the session's video resolution (session preset).
- Set the capture frame rate.
- Create a video preview layer and insert it into the view's layer.
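The frame-rate check in the listing above (comparing the desired 1/30 s frame duration against each AVFrameRateRange) can be exercised off-device. Here is a minimal plain-C sketch of the same range test using rational frame durations; all names here are mine, not AVFoundation's:

```c
#include <assert.h>
#include <stdbool.h>

/* A frame duration as a rational number: value/timescale seconds,
   mirroring how CMTimeMake(1, 30) stores 1/30 s. */
typedef struct { long value; long timescale; } frame_duration;

/* Compare a/b <= c/d without floating point: a*d <= c*b (positive timescales). */
static bool duration_lte(frame_duration a, frame_duration b) {
    return a.value * b.timescale <= b.value * a.timescale;
}

/* True if desired lies within [min_duration, max_duration], as in the
   videoSupportedFrameRateRanges loop. */
static bool frame_duration_supported(frame_duration desired,
                                     frame_duration min_duration,
                                     frame_duration max_duration) {
    return duration_lte(min_duration, desired) && duration_lte(desired, max_duration);
}
```

For a device range of 1-60 fps (durations from 1/60 s to 1 s), 30 fps is accepted and 120 fps is rejected.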
Changing capture parameters: resolution and frame rate
Before discussing how to change the resolution and frame rate, let's look at how to monitor them, because only by monitoring these parameters can we verify that our settings actually took effect.
Monitoring the resolution:
We can read it directly from the AVCaptureSession's sessionPreset
, which is a string; just print it after setting it.
Monitoring the frame rate:
The frame rate is the number of frames captured per second. We can start a timer that fires once per second and prints the current capture frame rate. The following code counts the frames captured in each one-second window:
// Count how many video frames are captured per second
static int captureVideoFPS;
+ (void)calculatorCaptureFPS
{
    static int count = 0;
    static float lastTime = 0;
    CMClockRef hostClockRef = CMClockGetHostTimeClock();
    CMTime hostTime = CMClockGetTime(hostClockRef);
    float nowTime = CMTimeGetSeconds(hostTime);
    if (nowTime - lastTime >= 1) {
        captureVideoFPS = count;
        lastTime = nowTime;
        count = 1; // the current frame starts the new window
    } else {
        count++;
    }
}
// Read the current capture frame rate
+ (int)getCaptureVideoFPS
{
    return captureVideoFPS;
}
Changing the resolution
/**
 *  Reset the session's resolution
 *
 *  @param m_session  AVCaptureSession instance
 *  @param resolution Target vertical resolution (1080, 720, 480, or 360)
 */
+ (void)resetSessionPreset:(AVCaptureSession *)m_session resolution:(int)resolution
{
    [m_session beginConfiguration];
    switch (resolution) {
        case 1080:
            m_session.sessionPreset = [m_session canSetSessionPreset:AVCaptureSessionPreset1920x1080] ? AVCaptureSessionPreset1920x1080 : AVCaptureSessionPresetHigh;
            break;
        case 720:
            m_session.sessionPreset = [m_session canSetSessionPreset:AVCaptureSessionPreset1280x720] ? AVCaptureSessionPreset1280x720 : AVCaptureSessionPresetMedium;
            break;
        case 480:
            m_session.sessionPreset = [m_session canSetSessionPreset:AVCaptureSessionPreset640x480] ? AVCaptureSessionPreset640x480 : AVCaptureSessionPresetMedium;
            break;
        case 360:
            m_session.sessionPreset = AVCaptureSessionPresetMedium;
            break;
        default:
            break;
    }
    [m_session commitConfiguration];
}
Changing the frame rate
+ (void)settingFrameRate:(int)frameRate
{
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    [captureDevice lockForConfiguration:NULL];
    @try {
        [captureDevice setActiveVideoMinFrameDuration:CMTimeMake(1, frameRate)];
        [captureDevice setActiveVideoMaxFrameDuration:CMTimeMake(1, frameRate)];
    } @catch (NSException *exception) {
        NSLog(@"MediaIOS, the device does not support the requested frame rate, error: %@", exception.description);
    } @finally {
    }
    [captureDevice unlockForConfiguration];
}
Adding a pinch gesture to the preview layer
With a two-finger pinch gesture, the user can zoom the previewed video in and out.
#define MiMaxZoomFactor 5.0f
#define MiPrinchVelocityDividerFactor 20.0f
+ (void)zoomCapture:(UIPinchGestureRecognizer *)recognizer
{
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([recognizer state] == UIGestureRecognizerStateChanged) {
        NSError *error = nil;
        if ([videoDevice lockForConfiguration:&error]) {
            CGFloat desiredZoomFactor = videoDevice.videoZoomFactor + atan2f(recognizer.velocity, MiPrinchVelocityDividerFactor);
            videoDevice.videoZoomFactor = desiredZoomFactor <= MiMaxZoomFactor ? MAX(1.0, MIN(desiredZoomFactor, videoDevice.activeFormat.videoMaxZoomFactor)) : MiMaxZoomFactor;
            [videoDevice unlockForConfiguration];
        } else {
            NSLog(@"error: %@", error);
        }
    }
}
Camera operations
While capturing video, you may also need to switch between the front and back cameras, toggle the torch, take photos, and so on.
Switching between front and back cameras
After switching, I set the resolution to a fixed 720p, because on some devices the front camera does not support 1080p. In a real project this value should be whatever preset you configured before, with a fallback strategy for presets the front camera does not support.
// Switch cameras
- (void)switchCamera
{
    [_m_session beginConfiguration];
    BOOL isBack = ([[_video_input device] position] == AVCaptureDevicePositionBack);
    AVCaptureDevicePosition targetPosition = isBack ? AVCaptureDevicePositionFront : AVCaptureDevicePositionBack;
    MiCameraType targetType = isBack ? MiCameraType_Front : MiCameraType_Back;
    for (AVCaptureDevice *device in [AVCaptureDevice devices]) {
        if ([device hasMediaType:AVMediaTypeVideo] && [device position] == targetPosition) {
            [self rePreviewWithCameraType:targetType device:device];
            break;
        }
    }
    [_m_session commitConfiguration];
}
- (void)rePreviewWithCameraType:(MiCameraType)cameraType device:(AVCaptureDevice *)device {
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) return;
    [_m_session removeInput:_video_input];
    // Drop to a low preset first so the new input can always be added
    _m_session.sessionPreset = AVCaptureSessionPresetLow;
    if ([_m_session canAddInput:input]) {
        [_m_session addInput:input];
    } else {
        return;
    }
    _video_input = input;
    _m_cameraType = cameraType;
    NSString *preset = AVCaptureSessionPreset1280x720;
    if ([device supportsAVCaptureSessionPreset:preset] && [_m_session canSetSessionPreset:preset]) {
        _m_session.sessionPreset = preset;
    }
    // Otherwise the session stays at the low fallback preset set above
}
Toggling the torch
// Toggle the torch on and off
- (void)switchTorch
{
    [_m_session beginConfiguration];
    [[_video_input device] lockForConfiguration:NULL];
    self.m_torchMode = [_video_input device].torchMode == AVCaptureTorchModeOn ? AVCaptureTorchModeOff : AVCaptureTorchModeOn;
    if ([[_video_input device] isTorchModeSupported:_m_torchMode]) {
        [_video_input device].torchMode = self.m_torchMode;
    }
    [[_video_input device] unlockForConfiguration];
    [_m_session commitConfiguration];
}
Taking a photo and saving it to the photo library
The approach is:
- Keep a flag; when the photo action is triggered, set it.
- In the capture delegate method, check whether the flag indicates a photo was requested; if so, convert the current CMSampleBufferRef
to a UIImage, then save the UIImage to the photo library.
Note: the following conversion only saves a correct color photo when the output pixel format is RGB (e.g. kCVPixelFormatType_32BGRA), not the biplanar YUV format used earlier.
- (UIImage *)convertSameBufferToUIImage:(CMSampleBufferRef)sampleBuffer
{
    // Get the Core Video image buffer backing the sample buffer
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    // Get the bytes per row
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // Create a bitmap graphics context from the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    // Create a Quartz image from the pixel data in the bitmap context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    // Release the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    // Create a UIImage from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];
    // Release the Quartz image
    CGImageRelease(quartzImage);
    return image;
}
// Requires #import <AssetsLibrary/AssetsLibrary.h>
// (ALAssetsLibrary is deprecated; the Photos framework is the modern replacement)
+ (void)saveImageToSysphotos:(UIImage *)image
{
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library writeImageToSavedPhotosAlbum:image.CGImage metadata:nil completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error) {
            NSLog(@"MediaIos, saving photo to the photo library failed, error: %@", error.description);
        } else {
            NSLog(@"MediaIos, photo saved successfully...");
        }
    }];
}
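The flag handshake described above (UI thread sets a flag, capture callback consumes it exactly once) can be modeled off-device. A plain-C sketch using a C11 atomic flag; the names are mine, not from the article's code:

```c
#include <assert.h>
#include <stdatomic.h>
#include <stdbool.h>

/* Set by the UI thread when the user taps the shutter button. */
static atomic_bool photo_requested = false;

static void request_photo(void) {
    atomic_store(&photo_requested, true);
}

/* Called once per captured frame on the capture queue; returns true for
   exactly one frame per request, which is the frame to convert and save.
   atomic_exchange reads the flag and clears it in one step, so two frames
   can never both see the request. */
static bool should_capture_this_frame(void) {
    return atomic_exchange(&photo_requested, false);
}
```

The atomic exchange is what makes the check-and-clear race-free between the UI thread and the capture queue.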
Tap to focus
// Autofocus on the tapped point
- (void)mifocus:(UITapGestureRecognizer *)sender
{
    CGPoint point = [sender locationInView:self.m_displayView];
    [self miAutoFocusWithPoint:point];
    NSLog(@"MediaIos, auto focus complete...");
}
- (void)miAutoFocusWithPoint:(CGPoint)touchPoint {
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([captureDevice isFocusPointOfInterestSupported] && [captureDevice isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
        NSError *error;
        if ([captureDevice lockForConfiguration:&error]) {
            // Note: focusPointOfInterest expects normalized device coordinates (0..1);
            // convert view coordinates with -[AVCaptureVideoPreviewLayer captureDevicePointOfInterestForPoint:] first.
            // Set the exposure point
            [captureDevice setExposurePointOfInterest:touchPoint];
            [captureDevice setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
            // Set the focus point
            [captureDevice setFocusPointOfInterest:touchPoint];
            [captureDevice setFocusMode:AVCaptureFocusModeAutoFocus];
            [captureDevice unlockForConfiguration];
        }
    }
}
Adjusting exposure
// Exposure adjustment via a slider
- (void)changeExposure:(id)sender
{
    UISlider *slider = (UISlider *)sender;
    [self michangeExposure:slider.value];
}
- (void)michangeExposure:(CGFloat)value {
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error;
    if ([device lockForConfiguration:&error]) {
        [device setExposureTargetBias:value completionHandler:nil];
        [device unlockForConfiguration];
    }
}
Setting the white balance
- (AVCaptureWhiteBalanceGains)recalcGains:(AVCaptureWhiteBalanceGains)gains
                                 minValue:(CGFloat)minValue
                                 maxValue:(CGFloat)maxValue
{
    AVCaptureWhiteBalanceGains tmpGains = gains;
    tmpGains.blueGain  = MAX(MIN(tmpGains.blueGain,  maxValue), minValue);
    tmpGains.redGain   = MAX(MIN(tmpGains.redGain,   maxValue), minValue);
    tmpGains.greenGain = MAX(MIN(tmpGains.greenGain, maxValue), minValue);
    return tmpGains;
}
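recalcGains is pure per-channel clamping, so it is easy to check in isolation. A C analogue (max_value stands in for device.maxWhiteBalanceGain; the struct and names are mine):

```c
#include <assert.h>

/* Per-channel white balance gains, mirroring AVCaptureWhiteBalanceGains. */
typedef struct { float red; float green; float blue; } wb_gains;

static float clampf(float v, float lo, float hi) {
    return v < lo ? lo : (v > hi ? hi : v);
}

/* Clamp each channel gain into [min_value, max_value]; gains outside this
   range would make setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains: throw. */
static wb_gains clamp_gains(wb_gains g, float min_value, float max_value) {
    g.red   = clampf(g.red,   min_value, max_value);
    g.green = clampf(g.green, min_value, max_value);
    g.blue  = clampf(g.blue,  min_value, max_value);
    return g;
}
```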
- (void)setWhiteBlanceUseTemperature:(CGFloat)temperature {
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([device isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeLocked]) {
        [device lockForConfiguration:nil];
        AVCaptureWhiteBalanceGains currentGains = device.deviceWhiteBalanceGains;
        CGFloat currentTint = [device temperatureAndTintValuesForDeviceWhiteBalanceGains:currentGains].tint;
        AVCaptureWhiteBalanceTemperatureAndTintValues tempAndTintValues = {
            .temperature = temperature,
            .tint = currentTint,
        };
        AVCaptureWhiteBalanceGains gains = [device deviceWhiteBalanceGainsForTemperatureAndTintValues:tempAndTintValues];
        CGFloat maxWhiteBalanceGain = device.maxWhiteBalanceGain;
        gains = [self recalcGains:gains minValue:1 maxValue:maxWhiteBalanceGain];
        [device setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:gains completionHandler:nil];
        [device unlockForConfiguration];
    }
}
// White balance adjustment via a slider
- (void)whiteBlanceChange:(id)sender
{
    UISlider *slider = (UISlider *)sender;
    [self setWhiteBlanceUseTemperature:slider.value];
}