Compiling FFmpeg on the Mac to get FFmpeg-iOS
H.264 decoding with FFmpeg
A simple wrapper around an FFmpeg-iOS streamer
Today we'll look at using FFmpeg on the iOS platform to get hold of the camera and microphone. The code is short, and afterwards we'll add an example that captures the camera with iOS's own APIs.
Enumerating the camera and microphone with FFmpeg
- 首先導(dǎo)入必要的頭文件
#include <stdio.h>
#ifdef __cplusplus
extern "C"
{
#endif
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
#include <libswscale/swscale.h>
#include <libavdevice/avdevice.h>
#ifdef __cplusplus
};
#endif
The code is wrapped up in a simple method, as follows:
- (void)showDevice{
    avdevice_register_all();
    AVFormatContext *pFormatCtx = avformat_alloc_context();
    AVDictionary *options = NULL;
    av_dict_set(&options, "list_devices", "true", 0);
    AVInputFormat *iformat = av_find_input_format("avfoundation");
    printf("==AVFoundation Device Info===\n");
    // This open is expected to fail: with list_devices=true the
    // avfoundation demuxer only prints the device list to the log.
    avformat_open_input(&pFormatCtx, "", iformat, &options);
    av_dict_free(&options);
    printf("=============================\n");
    // Now actually open video device index 0 (the back camera below).
    if (avformat_open_input(&pFormatCtx, "0", iformat, NULL) != 0) {
        printf("Couldn't open input stream.\n");
        return;
    }
}
Run it and the log area prints the following:
==AVFoundation Device Info===
2017-07-20 16:59:36.325150+0800 LBffmpegDemo[2040:821433] [MC] System group container for systemgroup.com.apple.configurationprofiles path is /private/var/containers/Shared/SystemGroup/systemgroup.com.apple.configurationprofiles
2017-07-20 16:59:36.326529+0800 LBffmpegDemo[2040:821433] [MC] Reading from public effective user settings.
[AVFoundation input device @ 0x145d0100] AVFoundation video devices:
[AVFoundation input device @ 0x145d0100] [0] Back Camera
[AVFoundation input device @ 0x145d0100] [1] Front Camera
[AVFoundation input device @ 0x145d0100] AVFoundation audio devices:
[AVFoundation input device @ 0x145d0100] [0] iPhone Microphone
=============================
[avfoundation @ 0x153ef800] Selected framerate (29.970030) is not supported by the device
[avfoundation @ 0x153ef800] Supported modes:
[avfoundation @ 0x153ef800] 192x144@[1.000000 30.000000]fps
[avfoundation @ 0x153ef800] 192x144@[1.000000 30.000000]fps
[avfoundation @ 0x153ef800] 352x288@[1.000000 30.000000]fps
[avfoundation @ 0x153ef800] 352x288@[1.000000 30.000000]fps
[avfoundation @ 0x153ef800] 480x360@[1.000000 30.000000]fps
[avfoundation @ 0x153ef800] 480x360@[1.000000 30.000000]fps
[avfoundation @ 0x153ef800] 640x480@[1.000000 30.000000]fps
[avfoundation @ 0x153ef800] 640x480@[1.000000 30.000000]fps
[avfoundation @ 0x153ef800] 960x540@[1.000000 30.000000]fps
[avfoundation @ 0x153ef800] 960x540@[1.000000 30.000000]fps
[avfoundation @ 0x153ef800] 1280x720@[1.000000 30.000000]fps
[avfoundation @ 0x153ef800] 1280x720@[1.000000 30.000000]fps
[avfoundation @ 0x153ef800] 1280x720@[1.000000 60.000000]fps
[avfoundation @ 0x153ef800] 1280x720@[1.000000 60.000000]fps
[avfoundation @ 0x153ef800] 1920x1080@[1.000000 30.000000]fps
[avfoundation @ 0x153ef800] 1920x1080@[1.000000 30.000000]fps
[avfoundation @ 0x153ef800] 2592x1936@[1.000000 20.000000]fps
[avfoundation @ 0x153ef800] 2592x1936@[1.000000 20.000000]fps
[avfoundation @ 0x153ef800] 3264x2448@[1.000000 20.000000]fps
[avfoundation @ 0x153ef800] 3264x2448@[1.000000 20.000000]fps
Couldn't open input stream.
Clearly we got our devices: the back and front cameras, plus the microphone. Next, let's look at the system's built-in way to capture the camera:
Capturing the camera with the iOS system APIs
- 首先導(dǎo)入必須的頭文件
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>
- Then declare some properties
@property(nonatomic, strong) AVCaptureSession *captureSession;
@property(nonatomic, strong) AVCaptureDevice *captureDevice;
@property(nonatomic, strong) AVCaptureDeviceInput *captureDeviceInput;
@property(nonatomic, strong) AVCaptureVideoDataOutput *captureVideoDataOutput;
@property(nonatomic, assign) CGSize videoSize;
@property(nonatomic, strong) AVCaptureConnection *videoCaptureConnection;
@property(nonatomic, strong) AVCaptureVideoPreviewLayer *previewLayer;
- Finally, the simple wrapper code
- (void)getMovieDevice:(UIView *)view{
    self.captureSession = [[AVCaptureSession alloc] init];
    // captureSession.sessionPreset = AVCaptureSessionPresetMedium;
    self.captureSession.sessionPreset = AVCaptureSessionPreset1920x1080;
    self.videoSize = [self getVideoSize:self.captureSession.sessionPreset];
    self.captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    self.captureDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:self.captureDevice error:&error];
    if ([self.captureSession canAddInput:self.captureDeviceInput]) {
        [self.captureSession addInput:self.captureDeviceInput];
    } else {
        NSLog(@"Error: %@", error);
    }
    dispatch_queue_t queue = dispatch_queue_create("myEncoderH264Queue", NULL);
    self.captureVideoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    [self.captureVideoDataOutput setSampleBufferDelegate:self queue:queue];
#if encodeModel
    // NV12, the bi-planar Y'CbCr layout the hardware H.264 encoder expects
    NSDictionary *settings = [[NSDictionary alloc] initWithObjectsAndKeys:
                              [NSNumber numberWithUnsignedInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange],
                              kCVPixelBufferPixelFormatTypeKey,
                              nil];
#else
    // 32BGRA, convenient for rendering directly
    NSDictionary *settings = [[NSDictionary alloc] initWithObjectsAndKeys:
                              [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA],
                              kCVPixelBufferPixelFormatTypeKey,
                              nil];
#endif
    self.captureVideoDataOutput.videoSettings = settings;
    self.captureVideoDataOutput.alwaysDiscardsLateVideoFrames = YES;
    if ([self.captureSession canAddOutput:self.captureVideoDataOutput]) {
        [self.captureSession addOutput:self.captureVideoDataOutput];
    }
    // Keep the connection so the sample-buffer delegate can tell
    // whether a buffer came from the video or the audio output.
    self.videoCaptureConnection = [self.captureVideoDataOutput connectionWithMediaType:AVMediaTypeVideo];
#pragma mark -- AVCaptureVideoPreviewLayer init
    self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    self.previewLayer.frame = view.layer.bounds;
    self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; // how the preview scales the video
    [[self.previewLayer connection] setVideoOrientation:AVCaptureVideoOrientationPortrait]; // video orientation
    [self.captureSession startRunning];
    [view.layer addSublayer:self.previewLayer];
}
- (CGSize)getVideoSize:(NSString *)sessionPreset {
    CGSize size = CGSizeZero;
    if ([sessionPreset isEqualToString:AVCaptureSessionPresetMedium]) {
        size = CGSizeMake(480, 360);
    } else if ([sessionPreset isEqualToString:AVCaptureSessionPreset1920x1080]) {
        size = CGSizeMake(1920, 1080);
    } else if ([sessionPreset isEqualToString:AVCaptureSessionPreset1280x720]) {
        size = CGSizeMake(1280, 720);
    } else if ([sessionPreset isEqualToString:AVCaptureSessionPreset640x480]) {
        size = CGSizeMake(640, 480);
    }
    return size;
}
#pragma mark -- AVCaptureVideo(Audio)DataOutputSampleBufferDelegate method
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // sampleBuffer holds the captured data; check the connection to
    // tell whether it is video or audio.
    if (connection == self.videoCaptureConnection) {
        // Video
        // NSLog(@"Got a video sampleBuffer here; process it further (e.g. encode to H.264)");
#if encodeModel
        // encode
#else
        CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        // int pixelFormat = CVPixelBufferGetPixelFormatType(pixelBuffer);
        // switch (pixelFormat) {
        //     case kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange:
        //         NSLog(@"Capture pixel format=NV12");
        //         break;
        //     case kCVPixelFormatType_422YpCbCr8:
        //         NSLog(@"Capture pixel format=UYVY422");
        //         break;
        //     default:
        //         NSLog(@"Capture pixel format=RGB32");
        //         break;
        // }
        CVPixelBufferLockBaseAddress(pixelBuffer, 0);
        // render
        [openglView render:pixelBuffer];
        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
#endif
    }
    // else if (connection == _audioConnection) {
    //     // Audio
    //     NSLog(@"Got an audio sampleBuffer here; process it further (e.g. encode to AAC)");
    // }
}
That wraps up camera capture on iOS. When time allows I'll write up some other ways to use FFmpeg on the iOS platform; if you're interested in FFmpeg, feel free to follow me!
My blog will soon be migrated and synced to the Tencent Cloud+ Community; you're welcome to join as well: https://cloud.tencent.com/developer/support-plan?invite_code=3i16zjhqnn0gw