Overview
AVFoundation is a framework for working with and creating time-based audiovisual media. It was designed with today's hardware and applications in mind: it relies heavily on multithreading, takes full advantage of multicore hardware through extensive use of blocks and GCD, and moves computationally expensive work onto background queues. It provides hardware-accelerated operations automatically, ensuring applications run at their best performance on most devices, and it is built for 64-bit processors, so it can exploit all of their advantages.
Switching Cameras
Most iPhones have both a front and a back camera, and switching between them is a basic requirement of any camera app. Before switching, we need to check whether the current device actually has more than one camera, whether the user has granted camera permission, and whether the target camera supports the capture settings (for example the configured picture size) we are using.
#pragma mark - Switching cameras
- (AVCaptureDevice *)deviceWithPosition:(AVCaptureDevicePosition)position
{
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if (device.position == position) {
            return device;
        }
    }
    return nil;
}

- (BOOL)canSwitchCamera
{
    return [[AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo] count] > 1;
}

- (void)switchCamera
{
    if (![self canSwitchCamera]) {
        return;
    }
    // Toggle between the front and back camera.
    AVCaptureDevicePosition devicePosition;
    if (self.deviceInput.device.position == AVCaptureDevicePositionBack) {
        devicePosition = AVCaptureDevicePositionFront;
    } else {
        devicePosition = AVCaptureDevicePositionBack;
    }
    // Perform the input swap inside a single configuration transaction.
    [self.captureSession beginConfiguration];
    AVCaptureDeviceInput *oldInput = self.deviceInput;
    [self.captureSession removeInput:oldInput];
    NSError *error;
    AVCaptureDevice *device = [self deviceWithPosition:devicePosition];
    AVCaptureDeviceInput *newInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!newInput) {
        // Creating the new input failed; restore the old one so the
        // session is not left without a video input.
        [self.captureSession addInput:oldInput];
        [self.captureSession commitConfiguration];
        return;
    }
    [self.captureSession addInput:newInput];
    self.deviceInput = newInput;
    [self.captureSession commitConfiguration];
}
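The introduction above also mentions checking the user's camera permission before capturing. A minimal sketch using AVCaptureDevice's authorization API (the method name requestCameraAccess: is our own, not part of the article's sample):

```objectivec
#pragma mark - Camera permission
- (void)requestCameraAccess:(void (^)(BOOL granted))completion
{
    AVAuthorizationStatus status =
        [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
    switch (status) {
        case AVAuthorizationStatusAuthorized:
            completion(YES);
            break;
        case AVAuthorizationStatusNotDetermined:
            // First request: prompts the user. The handler may be called on an
            // arbitrary queue, so dispatch to the main queue before touching UI.
            [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo
                                     completionHandler:completion];
            break;
        default: // Denied or restricted
            completion(NO);
            break;
    }
}
```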
Adjusting Focus
Most iOS devices support focusing at a given point of interest: pass in a point, and the system automatically focuses there. Note that you must set the point of interest first and only then set the focus mode. The CGPoint is expressed in a coordinate space that runs from (0,0) at the top-left of the frame to (1,1) at the bottom-right.
#pragma mark - Autofocus
- (void)autoFocus
{
    if (!self.deviceInput.device) {
        return;
    }
    if ([self.deviceInput.device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
        NSError *error;
        if ([self.deviceInput.device lockForConfiguration:&error]) {
            self.deviceInput.device.focusMode = AVCaptureFocusModeAutoFocus;
            [self.deviceInput.device unlockForConfiguration];
        }
    }
}

#pragma mark - Tap to focus
- (BOOL)canTapFocus
{
    return [self.deviceInput.device isFocusPointOfInterestSupported];
}

- (void)focusAtPoint:(CGPoint)point
{
    if (![self canTapFocus]) {
        return;
    }
    if ([self.deviceInput.device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
        NSError *error;
        if ([self.deviceInput.device lockForConfiguration:&error]) {
            // Set the point of interest first, then the focus mode.
            self.deviceInput.device.focusPointOfInterest = point;
            self.deviceInput.device.focusMode = AVCaptureFocusModeAutoFocus;
            [self.deviceInput.device unlockForConfiguration];
        }
    }
}
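The (0,0)–(1,1) coordinate space above is the device's, not the view's. If the preview is shown through an AVCaptureVideoPreviewLayer, the layer can do the conversion for you. A sketch, assuming a previewLayer property holding that layer:

```objectivec
// Tap handler on the preview view (hypothetical wiring).
- (void)handleTap:(UITapGestureRecognizer *)tap
{
    CGPoint layerPoint = [tap locationInView:tap.view];
    // Converts a layer-space point into the device's (0,0)-(1,1) space,
    // taking videoGravity and mirroring into account.
    CGPoint devicePoint =
        [self.previewLayer captureDevicePointOfInterestForPoint:layerPoint];
    [self focusAtPoint:devicePoint];
}
```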
Exposure
Most iOS devices also support metering exposure at a given point of interest: pass in a point, and the system automatically exposes for it. Again, set the point first and the exposure mode second. The CGPoint uses the same coordinate space, from (0,0) at the top-left of the frame to (1,1) at the bottom-right.
#pragma mark - Exposure
- (BOOL)canTapExpose
{
    return [self.deviceInput.device isExposurePointOfInterestSupported];
}

- (void)exposeAtPoint:(CGPoint)point
{
    if (![self canTapExpose]) {
        return;
    }
    if ([self.deviceInput.device isExposureModeSupported:AVCaptureExposureModeAutoExpose]) {
        NSError *error;
        if ([self.deviceInput.device lockForConfiguration:&error]) {
            self.deviceInput.device.exposurePointOfInterest = point;
            self.deviceInput.device.exposureMode = AVCaptureExposureModeAutoExpose;
            [self.deviceInput.device unlockForConfiguration];
        }
    }
}
Flash
The flash is straightforward to use. It has three modes: AVCaptureFlashModeOff, AVCaptureFlashModeOn, and AVCaptureFlashModeAuto.
#pragma mark - Flash
- (BOOL)haveFlash
{
    return [self.deviceInput.device hasFlash];
}

- (AVCaptureFlashMode)currentFlashMode
{
    return self.deviceInput.device.flashMode;
}

- (void)setFlashMode:(AVCaptureFlashMode)flashMode
{
    if (self.deviceInput.device.flashMode == flashMode) {
        return;
    }
    if ([self.deviceInput.device isFlashModeSupported:flashMode]) {
        NSError *error;
        if ([self.deviceInput.device lockForConfiguration:&error]) {
            self.deviceInput.device.flashMode = flashMode;
            [self.deviceInput.device unlockForConfiguration];
        }
    }
}
Torch
The torch is equally simple, with three modes: AVCaptureTorchModeOff, AVCaptureTorchModeOn, and AVCaptureTorchModeAuto.
#pragma mark - Torch
- (BOOL)haveTorch
{
    return [self.deviceInput.device hasTorch];
}

- (AVCaptureTorchMode)currentTorchMode
{
    return self.deviceInput.device.torchMode;
}

- (void)setTorchMode:(AVCaptureTorchMode)torchMode
{
    if (self.deviceInput.device.torchMode == torchMode) {
        return;
    }
    if ([self.deviceInput.device isTorchModeSupported:torchMode]) {
        NSError *error;
        if ([self.deviceInput.device lockForConfiguration:&error]) {
            self.deviceInput.device.torchMode = torchMode;
            [self.deviceInput.device unlockForConfiguration];
        }
    }
}

- (void)setTorchLevel:(float)torchLevel
{
    if ([self.deviceInput.device isTorchActive]) {
        NSError *error;
        if ([self.deviceInput.device lockForConfiguration:&error]) {
            // Turns the torch on at the given brightness, in (0.0, 1.0].
            [self.deviceInput.device setTorchModeOnWithLevel:torchLevel error:&error];
            [self.deviceInput.device unlockForConfiguration];
        }
    }
}
Saving Images
With ALAssetsLibrary we can easily write photos or videos into the user's asset library. Pay attention to photo-library permissions when using it.
#pragma mark - Saving images
- (void)writeImageToPhotosAlbum:(UIImage *)image
{
    ALAssetsLibrary *assetsLib = [[ALAssetsLibrary alloc] init];
    [assetsLib writeImageToSavedPhotosAlbum:image.CGImage
                                orientation:(ALAssetOrientation)image.imageOrientation
                            completionBlock:^(NSURL *assetURL, NSError *error) {
                                if (error) {
                                    NSLog(@"save failed: %@", error);
                                } else {
                                    NSLog(@"saved to %@", assetURL);
                                }
                            }];
}
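Note that ALAssetsLibrary has been deprecated since iOS 9. On newer systems the same write can be done with PhotoKit; a minimal sketch (this is our own alternative, not part of the article's sample, and it requires the Photos framework plus photo-library permission):

```objectivec
@import Photos;

- (void)saveImageWithPhotoKit:(UIImage *)image
{
    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
        // Queue an asset-creation request inside the change block.
        [PHAssetChangeRequest creationRequestForAssetFromImage:image];
    } completionHandler:^(BOOL success, NSError *error) {
        if (!success) {
            NSLog(@"save failed: %@", error);
        }
    }];
}
```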
Video Zoom
Zooming is controlled by the videoZoomFactor property. Its minimum value is 1.0 (no zoom), and its maximum is given by the active format's videoMaxZoomFactor.
#pragma mark - Video zoom
- (BOOL)videoCanZoom
{
    return self.deviceInput.device.activeFormat.videoMaxZoomFactor > 1.0f;
}

- (float)videoMaxZoomFactor
{
    // Cap the usable zoom; digital zoom quality degrades quickly at high factors.
    return MIN(self.deviceInput.device.activeFormat.videoMaxZoomFactor, 4.0f);
}

- (void)setVideoZoomFactor:(float)factor
{
    if (self.deviceInput.device.isRampingVideoZoom) {
        return;
    }
    NSError *error;
    if ([self.deviceInput.device lockForConfiguration:&error]) {
        // factor is a linear value in [0, 1]; pow() maps it onto an
        // exponential curve from 1.0 (no zoom) to videoMaxZoomFactor.
        self.deviceInput.device.videoZoomFactor = pow([self videoMaxZoomFactor], factor);
        [self.deviceInput.device unlockForConfiguration];
    }
}

- (void)rampZoomToFactor:(float)factor
{
    if (self.deviceInput.device.isRampingVideoZoom) {
        return;
    }
    NSError *error;
    if ([self.deviceInput.device lockForConfiguration:&error]) {
        [self.deviceInput.device rampToVideoZoomFactor:pow([self videoMaxZoomFactor], factor)
                                              withRate:1.0f];
        [self.deviceInput.device unlockForConfiguration];
    }
}
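The pow() call above maps a linear control value onto an exponential zoom curve, which feels more natural to users than a linear one: equal steps of the control produce equal multiplicative steps of zoom. A hypothetical usage sketch, assuming a UISlider configured with a 0–1 range:

```objectivec
// Slider action wired up in Interface Builder or with addTarget:.
- (void)sliderValueChanged:(UISlider *)slider
{
    // slider.value is in [0, 1]; setVideoZoomFactor: turns it into a
    // zoom factor in [1.0, videoMaxZoomFactor] via pow().
    [self setVideoZoomFactor:slider.value];
}
```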
CoreVideo Rendering
Why use fast texture upload? Compared with plain OpenGL ES texture uploads, the fast texture cache dramatically speeds up moving frames into textures, which is why GPUImage prefers it whenever available. Reading the GPUImage source, you will find this comment: // Note: the fast texture caches speed up 640x480 frame reads from 9.6 ms to 3.1 ms on iPhone 4S
See the GPUImageRawDataOutput.m file in that framework for details.
- (void)setupOpenGLTextureCache
{
    CVReturn status = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault,
                                                   NULL,
                                                   _context,
                                                   NULL,
                                                   &_openGLESTextureCache);
    if (status != kCVReturnSuccess) {
        NSLog(@"CVOpenGLESTextureCacheCreate failed: %d", status);
        exit(0);
    }
}
#pragma mark - GLTexture
- (void)genTextureFromImage:(CVImageBufferRef)imageRef
{
    CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                 _openGLESTextureCache,
                                                 imageRef,
                                                 NULL,
                                                 GL_TEXTURE_2D,
                                                 GL_RGBA,
                                                 (GLsizei)CVPixelBufferGetWidth(imageRef),
                                                 (GLsizei)CVPixelBufferGetHeight(imageRef),
                                                 GL_BGRA,
                                                 GL_UNSIGNED_BYTE,
                                                 0,
                                                 &_openGLESTexture);
    glBindTexture(GL_TEXTURE_2D, CVOpenGLESTextureGetName(_openGLESTexture));
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glUniform1i(glGetUniformLocation(_program, "image"), 0);
    if (_openGLESTexture) {
        // The CVOpenGLESTextureRef merely wraps the pixel buffer; release it
        // once the underlying GL texture is bound, then flush the cache.
        CFRelease(_openGLESTexture);
        _openGLESTexture = NULL;
        CVOpenGLESTextureCacheFlush(_openGLESTextureCache, 0);
    }
}
Real-time rendering of the camera feed is not covered again here; the sample project accompanying this article implements it. See the earlier articles or the sample code for the details.
References
《AVFoundation開(kāi)發(fā)秘籍：實(shí)踐掌握iOS & OSX應(yīng)用的視聽(tīng)處理技術(shù)》 (Chinese edition of Learning AV Foundation)
Source code: AVFoundation https://github.com/QinminiOS/AVFoundation