AV Foundation: Building a Simple Camera App

In AV Foundation: Understanding Media Capture we looked at the pieces involved in capturing media: the capture session AVCaptureSession, capture devices AVCaptureDevice, device inputs AVCaptureDeviceInput, capture outputs AVCaptureOutput, and the AVCaptureVideoPreviewLayer layer used to preview video content.

iOS camera apps let users capture photos and movies with the front and back cameras. This sample code project shows how to implement these basic capture features of the built-in front and back iPhone cameras in a camera app of your own.

A camera app has to be built and tested on a physical device, and before writing any capture code you need to handle the privacy requests for camera and microphone access.
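Concretely, the app must declare usage strings in Info.plist and request access at runtime before the first capture call. A minimal sketch (the method name requestCameraAuthorizationWithCompletion: is illustrative, not part of the sample project):

```objc
// Info.plist must contain these keys, or the app will be terminated on first access:
//   NSCameraUsageDescription     - why the app needs the camera
//   NSMicrophoneUsageDescription - why the app needs the microphone (for movie recording)
#import <AVFoundation/AVFoundation.h>

- (void)requestCameraAuthorizationWithCompletion:(void (^)(BOOL granted))completion {
    AVAuthorizationStatus status =
        [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
    switch (status) {
        case AVAuthorizationStatusNotDetermined:
            // Shows the system permission alert; the handler runs on an arbitrary queue,
            // so hop back to the main queue before touching any UI.
            [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo
                                     completionHandler:^(BOOL granted) {
                dispatch_async(dispatch_get_main_queue(), ^{ completion(granted); });
            }];
            break;
        case AVAuthorizationStatusAuthorized:
            completion(YES);
            break;
        default: // denied or restricted
            completion(NO);
            break;
    }
}
```

Only call the session setup and start methods once the completion block reports that access was granted, and repeat the same dance for AVMediaTypeAudio if movie recording is needed.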

Setting Up the Preview View

The camera app needs a view for the live preview. Its backing layer is an AVCaptureVideoPreviewLayer, which is attached to the AVCaptureSession so the preview stays in sync with the session, and the view also converts on-screen touch points into capture-device coordinates for focus and exposure:

- (id)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    if (self) {
        [self setupView];
    }
    return self;
}

- (id)initWithCoder:(NSCoder *)coder {
    self = [super initWithCoder:coder];
    if (self) {
        [self setupView];
    }
    return self;
}

+ (Class)layerClass {
    //Overriding layerClass on a UIView subclass lets the view be backed by a custom layer
    //Override it to return the AVCaptureVideoPreviewLayer class object
    return [AVCaptureVideoPreviewLayer class];
}

- (AVCaptureSession*)session {
    //Override the session getter to return the preview layer's capture session
    return [(AVCaptureVideoPreviewLayer*)self.layer session];
}

- (void)setSession:(AVCaptureSession *)session {
    //Override the session setter to reach the view's layer, which is an
    //AVCaptureVideoPreviewLayer instance. Setting its AVCaptureSession routes the
    //captured data straight into the layer and keeps it in sync with the session state.
    [(AVCaptureVideoPreviewLayer*)self.layer setSession:session];
}


//UI setup: gestures for single tap (focus), double tap (exposure), and a two-finger double tap (reset)
- (void)setupView {
    
    [(AVCaptureVideoPreviewLayer *)self.layer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    
    _singleTapRecognizer =
    [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleSingleTap:)];

    _doubleTapRecognizer =
    [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleDoubleTap:)];
    _doubleTapRecognizer.numberOfTapsRequired = 2;

    _doubleDoubleTapRecognizer =
    [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleDoubleDoubleTap:)];
    _doubleDoubleTapRecognizer.numberOfTapsRequired = 2;
    _doubleDoubleTapRecognizer.numberOfTouchesRequired = 2;

    [self addGestureRecognizer:_singleTapRecognizer];
    [self addGestureRecognizer:_doubleTapRecognizer];
    [self addGestureRecognizer:_doubleDoubleTapRecognizer];
    [_singleTapRecognizer requireGestureRecognizerToFail:_doubleTapRecognizer];

    _focusBox = [self viewWithColor:[UIColor colorWithRed:0.102 green:0.636 blue:1.000 alpha:1.000]];
    _exposureBox = [self viewWithColor:[UIColor colorWithRed:1.000 green:0.421 blue:0.054 alpha:1.000]];
    [self addSubview:_focusBox];
    [self addSubview:_exposureBox];
}

- (void)handleSingleTap:(UIGestureRecognizer *)recognizer {
    CGPoint point = [recognizer locationInView:self];
    [self runBoxAnimationOnView:self.focusBox point:point];
    if (self.delegate) {
        [self.delegate tappedToFocusAtPoint:[self captureDevicePointForPoint:point]];
    }
}

//Private helper used by the touch handlers defined in this class: converts a touch point in screen coordinates into a point in the capture device's coordinate space
- (CGPoint)captureDevicePointForPoint:(CGPoint)point {
    AVCaptureVideoPreviewLayer *layer =
        (AVCaptureVideoPreviewLayer *)self.layer;
    return [layer captureDevicePointOfInterestForPoint:point];
}

- (void)handleDoubleTap:(UIGestureRecognizer *)recognizer {
    CGPoint point = [recognizer locationInView:self];
    [self runBoxAnimationOnView:self.exposureBox point:point];
    if (self.delegate) {
        [self.delegate tappedToExposeAtPoint:[self captureDevicePointForPoint:point]];
    }
}

- (void)handleDoubleDoubleTap:(UIGestureRecognizer *)recognizer {
    [self runResetAnimation];
    if (self.delegate) {
        [self.delegate tappedToResetFocusAndExposure];
    }
}
- (void)runBoxAnimationOnView:(UIView *)view point:(CGPoint)point {
    view.center = point;
    view.hidden = NO;
    [UIView animateWithDuration:0.15f
                          delay:0.0f
                        options:UIViewAnimationOptionCurveEaseInOut
                     animations:^{
                         view.layer.transform = CATransform3DMakeScale(0.5, 0.5, 1.0);
                     }
                     completion:^(BOOL complete) {
                         double delayInSeconds = 0.5f;
                         dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, (int64_t)(delayInSeconds * NSEC_PER_SEC));
                         dispatch_after(popTime, dispatch_get_main_queue(), ^(void){
                             view.hidden = YES;
                             view.transform = CGAffineTransformIdentity;
                         });
                     }];
}

- (void)runResetAnimation {
    if (!self.tapToFocusEnabled && !self.tapToExposeEnabled) {
        return;
    }
    AVCaptureVideoPreviewLayer *previewLayer = (AVCaptureVideoPreviewLayer *)self.layer;
    CGPoint centerPoint = [previewLayer pointForCaptureDevicePointOfInterest:CGPointMake(0.5f, 0.5f)];
    self.focusBox.center = centerPoint;
    self.exposureBox.center = centerPoint;
    self.exposureBox.transform = CGAffineTransformMakeScale(1.2f, 1.2f);
    self.focusBox.hidden = NO;
    self.exposureBox.hidden = NO;
    [UIView animateWithDuration:0.15f
                          delay:0.0f
                        options:UIViewAnimationOptionCurveEaseInOut
                     animations:^{
                         self.focusBox.layer.transform = CATransform3DMakeScale(0.5, 0.5, 1.0);
                         self.exposureBox.layer.transform = CATransform3DMakeScale(0.7, 0.7, 1.0);
                     }
                     completion:^(BOOL complete) {
                         double delayInSeconds = 0.5f;
                         dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, (int64_t)(delayInSeconds * NSEC_PER_SEC));
                         dispatch_after(popTime, dispatch_get_main_queue(), ^(void){
                             self.focusBox.hidden = YES;
                             self.exposureBox.hidden = YES;
                             self.focusBox.transform = CGAffineTransformIdentity;
                             self.exposureBox.transform = CGAffineTransformIdentity;
                         });
                     }];
}

- (void)setTapToFocusEnabled:(BOOL)enabled {
    _tapToFocusEnabled = enabled;
    self.singleTapRecognizer.enabled = enabled;
}

- (void)setTapToExposeEnabled:(BOOL)enabled {
    _tapToExposeEnabled = enabled;
    self.doubleTapRecognizer.enabled = enabled;
}

- (UIView *)viewWithColor:(UIColor *)color {
    //BOX_BOUNDS is assumed to be a frame macro defined elsewhere,
    //e.g. #define BOX_BOUNDS CGRectMake(0.0f, 0.0f, 150.0f, 150.0f)
    UIView *view = [[UIView alloc] initWithFrame:BOX_BOUNDS];
    view.backgroundColor = [UIColor clearColor];
    view.layer.borderColor = color.CGColor;
    view.layer.borderWidth = 5.0f;
    view.hidden = YES;
    return view;
}

Setting Up the Capture Session, Inputs, and Outputs

The capture-session code lives in a camera controller class, CameraController. This class configures and manages the various capture devices, and also controls and interacts with the capture outputs. Start by writing a session setup method, - (BOOL)setupSession:(NSError **)error.

It wires up the input devices and capture outputs associated with the main capture session:
  1. Create the AVCaptureSession instance;
  2. Add the camera and microphone inputs: get the AVMediaTypeVideo and AVMediaTypeAudio capture devices and wrap them in AVCaptureDeviceInput objects;
  3. Finally, add the output instances: AVCaptureStillImageOutput for still images and AVCaptureMovieFileOutput for QuickTime movies.
- (BOOL)setupSession:(NSError **)error {

    //Create the capture session. AVCaptureSession is the central hub of the capture scene
    self.captureSession = [[AVCaptureSession alloc]init];
    
    /*
     AVCaptureSessionPresetHigh
     AVCaptureSessionPresetMedium
     AVCaptureSessionPresetLow
     AVCaptureSessionPreset640x480
     AVCaptureSessionPreset1280x720
     AVCaptureSessionPresetPhoto
     */
    //Set the capture resolution preset
    self.captureSession.sessionPreset = AVCaptureSessionPresetHigh;
    
    //Get the default video capture device; iOS returns the back camera
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    
    //Wrap the capture device in an AVCaptureDeviceInput
    //Note: to add a capture device to a session, it must be wrapped in an AVCaptureDeviceInput object
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:error];
    
    //Check that videoInput is valid
    if (videoInput)
    {
        //canAddInput: tests whether the input can be added to the session
        if ([self.captureSession canAddInput:videoInput])
        {
            //Add videoInput to the captureSession
            [self.captureSession addInput:videoInput];
            self.activeVideoInput = videoInput;
        }
    }else
    {
        return NO;
    }
    
    //Get the default audio capture device, i.e. a built-in microphone
    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    
    //Create a capture device input for it
    AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:error];
   
    //Check that audioInput is valid
    if (audioInput) {
        
        //canAddInput: tests whether the input can be added to the session
        if ([self.captureSession canAddInput:audioInput])
        {
            //Add audioInput to the captureSession
            [self.captureSession addInput:audioInput];
        }
    }else
    {
        return NO;
    }

    //AVCaptureStillImageOutput instance: captures still images from the camera
    self.imageOutput = [[AVCaptureStillImageOutput alloc]init];
    
    //Settings dictionary: request JPEG-encoded images
    self.imageOutput.outputSettings = @{AVVideoCodecKey:AVVideoCodecJPEG};
    
    //Check whether the output can be added to the session, and add it if so
    if ([self.captureSession canAddOutput:self.imageOutput])
    {
        [self.captureSession addOutput:self.imageOutput];
    }
    
    //Create an AVCaptureMovieFileOutput instance for recording QuickTime movies to the file system
    self.movieOutput = [[AVCaptureMovieFileOutput alloc]init];
    
    //Check whether the output can be added to the session, and add it if so
    if ([self.captureSession canAddOutput:self.movieOutput])
    {
        [self.captureSession addOutput:self.movieOutput];
    }
    
    self.videoQueue = dispatch_queue_create("example.VideoQueue", NULL);
    
    return YES;
}

Starting and Stopping the Session

The session's object graph is fully set up by setupSession:, but before the capture session can be used it has to be started, and whatever can be started should also be stoppable. Define two corresponding methods, startSession and stopSession, for callers:

- (void)startSession {

    //Check whether the session is already running
    if (![self.captureSession isRunning])
    {
        //startRunning is a blocking call that takes some time, so dispatch it asynchronously
        dispatch_async(self.videoQueue, ^{
            [self.captureSession startRunning];
        });
    }
}

- (void)stopSession {
    
    //Check whether the session is running
    if ([self.captureSession isRunning])
    {
        //Stop it asynchronously as well
        dispatch_async(self.videoQueue, ^{
            [self.captureSession stopRunning];
        });
    }
}

Switching Cameras

Almost every iOS device has both a front and a back camera. The first feature to build is letting the user switch between them:

  1. cameraWithPosition: searches the device array for the camera matching the given position and returns it;
  2. activeCamera returns the camera backing the current capture session input;
  3. inactiveCamera returns the camera opposite the currently active one;
  4. canSwitchCameras returns a BOOL indicating whether more than one camera is available;
  5. cameraCount returns the number of available video capture devices;
  6. finally, switchCameras activates the currently inactive camera.
#pragma mark - Device Configuration   methods supporting camera configuration

- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position {
    
    //Get the available video devices
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    
    //Iterate over them and return the one matching the position parameter
    for (AVCaptureDevice *device in devices)
    {
        if (device.position == position) {
            return device;
        }
    }
    return nil;
}

- (AVCaptureDevice *)activeCamera {

    //Return the device property of the current capture session's video input
    return self.activeVideoInput.device;
}

//Return the camera that is not currently active
- (AVCaptureDevice *)inactiveCamera {

    //Found by looking up the opposite of the active camera; returns nil if the device has only one camera
    AVCaptureDevice *device = nil;
    if (self.cameraCount > 1)
    {
        if ([self activeCamera].position == AVCaptureDevicePositionBack) {
            device = [self cameraWithPosition:AVCaptureDevicePositionFront];
        }else
        {
            device = [self cameraWithPosition:AVCaptureDevicePositionBack];
        }
    }

    return device;
}
//Whether more than one camera is available
- (BOOL)canSwitchCameras {
    return self.cameraCount > 1;
}

//Number of available video capture devices
- (NSUInteger)cameraCount {
     return [[AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo] count];
}

//Switch cameras
- (BOOL)switchCameras {
    //Make sure more than one camera is available
    if (![self canSwitchCameras])
    {
        return NO;
    }
    
    //Get the device opposite the current one
    NSError *error;
    AVCaptureDevice *videoDevice = [self inactiveCamera];
    
    //Wrap the device in an AVCaptureDeviceInput
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
    
    //Check that videoInput is not nil
    if (videoInput)
    {
        //Mark the beginning of the configuration change
        [self.captureSession beginConfiguration];
        
        //Remove the existing capture input from the session
        [self.captureSession removeInput:self.activeVideoInput];
        
        //Check whether the new input can be added
        if ([self.captureSession canAddInput:videoInput])
        {
            //If so, make videoInput the new video capture input
            [self.captureSession addInput:videoInput];
            
            //And record it as the active input
            self.activeVideoInput = videoInput;
        }else
        {
            //If the new device cannot be added, put the original video input back into the session
            [self.captureSession addInput:self.activeVideoInput];
        }
        
        //When configuration is done, commitConfiguration applies all the changes together as a batch.
        [self.captureSession commitConfiguration];
    }else
    {
        //If creating the AVCaptureDeviceInput failed, notify the delegate to handle the error
        [self.delegate deviceConfigurationFailedWithError:error];
        return NO;
    }
    
    return YES;
}

Configuring the Capture Device

Configuring Focus

  1. cameraSupportsTapToFocus reports whether tap-to-focus is supported;
  2. focusAtPoint: sets the focus point.
#pragma mark - Focus Methods   tap-to-focus implementation

- (BOOL)cameraSupportsTapToFocus {
    
    //Ask the active camera whether it supports focusing at a point of interest
    return [[self activeCamera]isFocusPointOfInterestSupported];
}

- (void)focusAtPoint:(CGPoint)point {
    
    AVCaptureDevice *device = [self activeCamera];
    
    //Check support for point-of-interest focus and for auto focus mode
    if (device.isFocusPointOfInterestSupported && [device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
        
        NSError *error;
        //Lock the device for configuration; if the lock is acquired
        if ([device lockForConfiguration:&error]) {
            
            //Set focusPointOfInterest to the CGPoint
            device.focusPointOfInterest = point;
            
            //Set focusMode to AVCaptureFocusModeAutoFocus
            device.focusMode = AVCaptureFocusModeAutoFocus;
            
            //Release the lock
            [device unlockForConfiguration];
        }else{
            //On error, hand it to the error-handling delegate
            [self.delegate deviceConfigurationFailedWithError:error];
        }
    }
}

Configuring Exposure

  1. cameraSupportsTapToExpose reports whether tap-to-expose is supported;
  2. exposeAtPoint: sets the exposure point;
  3. observe the adjustingExposure property; once the exposure adjustment finishes, lock the exposure and remove the observer.
#pragma mark - Exposure Methods   tap-to-expose implementation

- (BOOL)cameraSupportsTapToExpose {
    
    //Ask the device whether it supports exposing at a point of interest
    return [[self activeCamera] isExposurePointOfInterestSupported];
}

static const NSString *THCameraAdjustingExposureContext;

- (void)exposeAtPoint:(CGPoint)point {

    AVCaptureDevice *device = [self activeCamera];
    
    AVCaptureExposureMode exposureMode = AVCaptureExposureModeContinuousAutoExposure;
    
    //Check support for point-of-interest exposure and the AVCaptureExposureModeContinuousAutoExposure mode
    if (device.isExposurePointOfInterestSupported && [device isExposureModeSupported:exposureMode]) {
        
        NSError *error;
        
        //Lock the device for configuration
        if ([device lockForConfiguration:&error])
        {
            //Set the desired values
            device.exposurePointOfInterest = point;
            device.exposureMode = exposureMode;
            
            //Check whether the device supports locked exposure mode.
            if ([device isExposureModeSupported:AVCaptureExposureModeLocked]) {
                
                //If so, use KVO to watch the device's adjustingExposure property.
                [device addObserver:self forKeyPath:@"adjustingExposure" options:NSKeyValueObservingOptionNew context:&THCameraAdjustingExposureContext];
            }
            
            //Release the lock
            [device unlockForConfiguration];
        }else
        {
            [self.delegate deviceConfigurationFailedWithError:error];
        }
    }
}

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {

    //Check whether the context is THCameraAdjustingExposureContext
    if (context == &THCameraAdjustingExposureContext) {
        
        //Get the device
        AVCaptureDevice *device = (AVCaptureDevice *)object;
        
        //If the device is no longer adjusting exposure, and its exposureMode can be set to AVCaptureExposureModeLocked
        if(!device.isAdjustingExposure && [device isExposureModeSupported:AVCaptureExposureModeLocked])
        {
            //Remove self as the adjustingExposure observer so no further change notifications arrive
            [object removeObserver:self forKeyPath:@"adjustingExposure" context:&THCameraAdjustingExposureContext];
            
            //Hop back to the main queue asynchronously
            dispatch_async(dispatch_get_main_queue(), ^{
                NSError *error;
                if ([device lockForConfiguration:&error]) {
                    
                    //Change the exposureMode
                    device.exposureMode = AVCaptureExposureModeLocked;
                    
                    //Release the lock
                    [device unlockForConfiguration];
                }else
                {
                    [self.delegate deviceConfigurationFailedWithError:error];
                }
            });
        }
    }else
    {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    }
}

Resetting Focus and Exposure

//Reset focus & exposure
- (void)resetFocusAndExposureModes {

    AVCaptureDevice *device = [self activeCamera];
    
    AVCaptureFocusMode focusMode = AVCaptureFocusModeContinuousAutoFocus;
    
    //Check whether point-of-interest focus and continuous auto focus are supported
    BOOL canResetFocus = [device isFocusPointOfInterestSupported] && [device isFocusModeSupported:focusMode];
    
    AVCaptureExposureMode exposureMode = AVCaptureExposureModeContinuousAutoExposure;
    
    //Confirm the exposure can be reset
    BOOL canResetExposure = [device isExposurePointOfInterestSupported] && [device isExposureModeSupported:exposureMode];
    
    //Recall that in capture-device space the top-left corner is (0,0) and the bottom-right is (1,1), so the center is (0.5,0.5)
    CGPoint centPoint = CGPointMake(0.5f, 0.5f);
    
    NSError *error;
    
    //Lock the device for configuration
    if ([device lockForConfiguration:&error]) {
        
        //If focus can be reset, do so
        if (canResetFocus) {
            device.focusMode = focusMode;
            device.focusPointOfInterest = centPoint;
        }
        
        //If exposure can be reset, set the desired exposure mode
        if (canResetExposure) {
            device.exposureMode = exposureMode;
            device.exposurePointOfInterest = centPoint;
        }
        
        //Release the lock
        [device unlockForConfiguration];
    }else
    {
        [self.delegate deviceConfigurationFailedWithError:error];
    }
}

Configuring Flash and Torch

#pragma mark - Flash and Torch Modes    flash & torch

//Whether the camera has a flash
- (BOOL)cameraHasFlash {

    return [[self activeCamera]hasFlash];
}

//Current flash mode
- (AVCaptureFlashMode)flashMode {

    return [[self activeCamera]flashMode];
}

//Set the flash mode
- (void)setFlashMode:(AVCaptureFlashMode)flashMode {

    //Get the active camera device
    AVCaptureDevice *device = [self activeCamera];
    
    //Check whether the flash mode is supported
    if ([device isFlashModeSupported:flashMode]) {
    
        //If so, lock the device
        NSError *error;
        if ([device lockForConfiguration:&error]) {

            //Change the flash mode
            device.flashMode = flashMode;
            //Done; unlock the device
            [device unlockForConfiguration];
        }else
        {
            [self.delegate deviceConfigurationFailedWithError:error];
        }
    }
}

//Whether the camera has a torch
- (BOOL)cameraHasTorch {

    return [[self activeCamera]hasTorch];
}

//Current torch mode
- (AVCaptureTorchMode)torchMode {

    return [[self activeCamera]torchMode];
}

//Turn the torch on or off
- (void)setTorchMode:(AVCaptureTorchMode)torchMode {

    AVCaptureDevice *device = [self activeCamera];
    
    if ([device isTorchModeSupported:torchMode]) {
        
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            
            device.torchMode = torchMode;
            [device unlockForConfiguration];
        }else
        {
            [self.delegate deviceConfigurationFailedWithError:error];
        }
    }
}

Configuring Video Zoom

  1. First, cameraSupportsZoom: asks the currently selected AVCaptureDevice for its active AVCaptureDeviceFormat; if the format's videoMaxZoomFactor is greater than 1.0, the device supports zooming;
  2. maxZoomFactor determines the maximum allowed zoom factor;
  3. setZoomValue: changes the zoom level;
  4. rampZoomToValue: ramps the zoom from the current value to zoomValue over a period of time;
  5. cancelZoom cancels an in-flight zoom ramp;
  6. observe videoZoomFactor and rampingVideoZoom to drive UI updates.
- (BOOL)cameraSupportsZoom {
    return self.activeCamera.activeFormat.videoMaxZoomFactor > 1.0f;        // 1
}

- (CGFloat)maxZoomFactor {
    return MIN(self.activeCamera.activeFormat.videoMaxZoomFactor, 4.0f);    // 2
}

- (void)setZoomValue:(CGFloat)zoomValue {                                   // 3
    if (!self.activeCamera.isRampingVideoZoom) {

        NSError *error;
        if ([self.activeCamera lockForConfiguration:&error]) {              // 4

            // Provide linear feel to zoom slider
            CGFloat zoomFactor = pow([self maxZoomFactor], zoomValue);      // 5
            self.activeCamera.videoZoomFactor = zoomFactor;

            [self.activeCamera unlockForConfiguration];                     // 6

        } else {
            [self.delegate deviceConfigurationFailedWithError:error];
        }
    }
}

- (void)rampZoomToValue:(CGFloat)zoomValue {                                // 1
    CGFloat zoomFactor = pow([self maxZoomFactor], zoomValue);
    NSError *error;
    if ([self.activeCamera lockForConfiguration:&error]) {
        [self.activeCamera rampToVideoZoomFactor:zoomFactor                 // 2
                                        withRate:THZoomRate];               // THZoomRate: a zoom-rate constant (factors per second) assumed to be defined elsewhere
        [self.activeCamera unlockForConfiguration];
    } else {
        [self.delegate deviceConfigurationFailedWithError:error];
    }
}

- (void)cancelZoom {                                                        // 3
    NSError *error;
    if ([self.activeCamera lockForConfiguration:&error]) {
        [self.activeCamera cancelVideoZoomRamp];                            // 4
        [self.activeCamera unlockForConfiguration];
    } else {
        [self.delegate deviceConfigurationFailedWithError:error];
    }
}

- (void)addVideoZoomFactorObserver{
    [self.activeCamera addObserver:self                                 // 2
                        forKeyPath:@"videoZoomFactor"
                           options:0
                           context:NULL];
    [self.activeCamera addObserver:self                                 // 3
                        forKeyPath:@"rampingVideoZoom"
                           options:0
                           context:NULL];
}

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {

    if ([keyPath  isEqualToString: @"videoZoomFactor"]) {
        [self updateZoomingDelegate];                                       // 4
    } else if ([keyPath  isEqualToString: @"rampingVideoZoom"]) {
        if (self.activeCamera.isRampingVideoZoom) {
            [self updateZoomingDelegate];                                   // 5
        }
    } else {
        [super observeValueForKeyPath:keyPath
                             ofObject:object
                               change:change
                              context:context];
    }
}

- (void)updateZoomingDelegate {
    CGFloat curZoomFactor = self.activeCamera.videoZoomFactor;
    CGFloat maxZoomFactor = [self maxZoomFactor];
    CGFloat value = log(curZoomFactor) / log(maxZoomFactor);                // 6
    [self.zoomingDelegate rampedZoomToValue:value];                         // 7
}

Capturing Still Images

AVCaptureStillImageOutput defines the captureStillImageAsynchronouslyFromConnection:completionHandler: method to perform the actual capture. Define a captureStillImage method for the shutter button to call:

  1. Get an AVCaptureConnection via connectionWithMediaType:;
  2. Set the connection's videoOrientation;
  3. Define a completion handler block that receives a valid CMSampleBuffer, calls AVCaptureStillImageOutput's jpegStillImageNSDataRepresentation: to get the image bytes as NSData, and converts them into a UIImage instance.
#pragma mark - Image Capture Methods   capturing still images

//Determine the current video orientation
- (AVCaptureVideoOrientation)currentVideoOrientation {
    
    AVCaptureVideoOrientation orientation;
    
    //Map UIDevice's orientation to a capture orientation
    switch ([UIDevice currentDevice].orientation) {
        case UIDeviceOrientationPortrait:
            orientation = AVCaptureVideoOrientationPortrait;
            break;
        case UIDeviceOrientationLandscapeRight:
            orientation = AVCaptureVideoOrientationLandscapeLeft;
            break;
        case UIDeviceOrientationPortraitUpsideDown:
            orientation = AVCaptureVideoOrientationPortraitUpsideDown;
            break;
        default:
            orientation = AVCaptureVideoOrientationLandscapeRight;
            break;
    }
    
    return orientation;
}

/*
    AVCaptureStillImageOutput is a subclass of AVCaptureOutput, used for capturing still images
 */
- (void)captureStillImage {
    
    //Get the connection
    AVCaptureConnection *connection = [self.imageOutput connectionWithMediaType:AVMediaTypeVideo];
    
    //The app is portrait-only, but if the user shoots in landscape the resulting photo's orientation needs adjusting
    //Check whether setting the video orientation is supported
    if (connection.isVideoOrientationSupported) {
        
        //Apply the current orientation
        connection.videoOrientation = [self currentVideoOrientation];
    }
    
    //Define a handler block that receives the image's NSData
    id handler = ^(CMSampleBufferRef sampleBuffer,NSError *error)
                {
                    if (sampleBuffer != NULL) {
                        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:sampleBuffer];
                        UIImage *image = [[UIImage alloc]initWithData:imageData];
                        
                        //On success, write the image out (e.g. to the photo library); omitted here
                    }else
                    {
                        NSLog(@"NULL sampleBuffer:%@",[error localizedDescription]);
                    }
                };
    
    //Capture the still image
    [self.imageOutput captureStillImageAsynchronouslyFromConnection:connection completionHandler:handler];
}

Video Capture

AVCaptureMovieFileOutput inherits most of its core functionality from its parent class, AVCaptureFileOutput, which defines a number of practical features, such as recording up to a maximum duration or up to a specific file size. It can also be configured to preserve a minimum amount of free disk space, which matters when recording video on mobile devices with limited storage.

Normally, when a QuickTime movie is ready for publishing, the movie header metadata sits at the start of the file. That lets a video player quickly read the header to determine the file's contents, structure, and the locations of the samples it contains. When recording a QuickTime movie, however, the header cannot be created until all the samples have finished being captured; when recording ends, the header data is created and appended at the end of the file.

Creating the header only after all the movie samples are captured is a problem, especially on mobile devices. If the app crashes or is otherwise interrupted, say by an incoming phone call, the header is never written correctly and an unreadable movie file is left on disk. A core feature AVCaptureMovieFileOutput provides to address this is fragment-based capture of QuickTime movies.

When recording begins, a minimal header is written at the front of the file; as recording proceeds, fragments are written at a regular interval, building up a complete header. By default a fragment is written every 10 seconds, but the interval can be changed through the capture output's movieFragmentInterval property.
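For example, assuming self.movieOutput is the AVCaptureMovieFileOutput created in setupSession:, the interval could be shortened like this (the 5-second value is arbitrary):

```objc
// Write a movie fragment every 5 seconds instead of the default 10,
// trading a little file overhead for better crash resilience.
self.movieOutput.movieFragmentInterval = CMTimeMake(5, 1);
```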

  1. isRecording reports whether recording is in progress;

  2. startRecording gets the current video capture connection, sets enablesVideoStabilizationWhenAvailable to improve video stability, enables the camera's smooth autofocus mode (smoothAutoFocusEnabled) to slow the rate of focus operations, and then calls startRecordingToOutputFileURL:recordingDelegate: with the chosen file path to begin recording;

  3. stopRecording stops recording;

  4. implement the AVCaptureFileOutputRecordingDelegate protocol method captureOutput:didFinishRecordingToOutputFileAtURL:fromConnections:error: to receive and handle the result.

    (Figure: timeline of the video recording delegate callbacks)
#pragma mark - Video Capture Methods   capturing video

//Whether recording is in progress
- (BOOL)isRecording {

    return self.movieOutput.isRecording;
}

//Start recording
- (void)startRecording {

    if (![self isRecording]) {
        
        //Get the current video capture connection, used to configure some core properties of the captured video data
        AVCaptureConnection * videoConnection = [self.movieOutput connectionWithMediaType:AVMediaTypeVideo];
        
        //Check whether setting the videoOrientation property is supported.
        if([videoConnection isVideoOrientationSupported])
        {
            //If so, set the current video orientation
            videoConnection.videoOrientation = [self currentVideoOrientation];
        }
        
        //Check whether video stabilization is supported; it can noticeably improve quality and only applies when recording to a file
        if([videoConnection isVideoStabilizationSupported])
        {
            videoConnection.enablesVideoStabilizationWhenAvailable = YES;
        }
        
        AVCaptureDevice *device = [self activeCamera];
        
        //Enable smooth autofocus if supported, i.e. slow down the lens focusing speed; otherwise the camera would try to refocus abruptly while the user moves during recording.
        if (device.isSmoothAutoFocusSupported) {
            NSError *error;
            if ([device lockForConfiguration:&error]) {
                
                device.smoothAutoFocusEnabled = YES;
                [device unlockForConfiguration];
            }else
            {
                [self.delegate deviceConfigurationFailedWithError:error];
            }
        }
        
        //Find a unique file-system URL to write the captured video to
        self.outputURL = [self uniqueURL];
        
        //Call the capture output method; parameter 1: the recording destination path, parameter 2: the delegate
        [self.movieOutput startRecordingToOutputFileURL:self.outputURL recordingDelegate:self];
    }
}

- (CMTime)recordedDuration {
    
    return self.movieOutput.recordedDuration;
}


//Build a unique file-system URL to write the video to
- (NSURL *)uniqueURL {

    NSFileManager *fileManager = [NSFileManager defaultManager];
    
    //temporaryDirectoryWithTemplateString: creates a uniquely named directory to write the file into
    NSString *dirPath = [fileManager temporaryDirectoryWithTemplateString:@"kamera.XXXXXX"];
    
    if (dirPath) {
        
        NSString *filePath = [dirPath stringByAppendingPathComponent:@"kamera_movie.mov"];
        return [NSURL fileURLWithPath:filePath];
    }
    
    return nil;
}
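Note that temporaryDirectoryWithTemplateString: is not a Foundation method; it is assumed to be a small NSFileManager category built on the BSD mkdtemp() function, roughly like this sketch:

```objc
#import <Foundation/Foundation.h>
#include <stdlib.h>
#include <string.h>

@implementation NSFileManager (THAdditions)

- (NSString *)temporaryDirectoryWithTemplateString:(NSString *)templateString {
    //Append the template (e.g. @"kamera.XXXXXX") to the temporary directory path
    NSString *mkdTemplate =
        [NSTemporaryDirectory() stringByAppendingPathComponent:templateString];

    //mkdtemp mutates its argument, so copy the path into a writable C buffer
    char *buffer = strdup([mkdTemplate fileSystemRepresentation]);

    NSString *directoryPath = nil;
    //mkdtemp replaces the trailing "XXXXXX" with a unique suffix and
    //creates the directory atomically
    if (mkdtemp(buffer)) {
        directoryPath = [self stringWithFileSystemRepresentation:buffer
                                                          length:strlen(buffer)];
    }
    free(buffer);
    return directoryPath;
}

@end
```

Using mkdtemp avoids the race between checking for a free name and creating the directory, which is why the template must end in "XXXXXX".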

//Stop recording
- (void)stopRecording {

    //Only if currently recording
    if ([self isRecording]) {
        [self.movieOutput stopRecording];
    }
}

#pragma mark - AVCaptureFileOutputRecordingDelegate

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error {

    //Handle errors
    if (error) {
        [self.delegate mediaCaptureFailedWithError:error];
    }else
    {
        //On success, write the video out (e.g. to the photo library); omitted here
    }
    
    self.outputURL = nil;
}