一. The capture session: AVCaptureSession
AVCaptureSession connects input and output resources. A capture session manages the data streams coming from physical devices (such as the camera and microphone) and routes them to one or more destinations. The input and output wiring can be reconfigured dynamically, which lets you rearrange the capture environment while a session is running.
A capture session can be configured with a session preset, which controls the format and quality of the captured data. The default preset is AVCaptureSessionPresetHigh.
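As a minimal sketch of how a preset might be applied (the 640x480 preset here is just an example; availability should always be checked first):

```objc
AVCaptureSession *session = [[AVCaptureSession alloc] init];
//sessions default to AVCaptureSessionPresetHigh; switch only if the target preset is supported
if ([session canSetSessionPreset:AVCaptureSessionPreset640x480]) {
    session.sessionPreset = AVCaptureSessionPreset640x480;
}
```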
二. Capture devices
AVCaptureDevice provides access to the system's capture hardware; the most commonly used method is defaultDeviceWithMediaType:.
AVCaptureDevice defines an interface for physical devices such as cameras and microphones. These may be built into a Mac or iPhone, or be external hardware such as a digital camera or camcorder. AVCaptureDevice offers a large number of methods for controlling the hardware, for example the camera's focus, exposure, white balance, and flash.
三. Capture device inputs
Before a capture device can be used, it must be added to the session. A capture device cannot be added to an AVCaptureSession directly; it first has to be wrapped in an AVCaptureDeviceInput instance, which acts as the bridge between the device's output data and the capture session. An AVCaptureDeviceInput is created like this:
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
四. Capture device outputs
AVFoundation defines a number of concrete subclasses of AVCaptureOutput, an abstract base class used to route the data captured by a session to its destinations. The framework provides the following high-level subclasses:
AVCaptureStillImageOutput: captures still images.
AVCaptureMovieFileOutput: captures audio and video data.
There are also lower-level output classes, AVCaptureAudioDataOutput and AVCaptureVideoDataOutput, which give direct access to the digital samples captured by the hardware and make real-time processing of audio and video possible.
五. Capture connections
AVCaptureConnection represents a connection between a capture input and a capture output. A connection can be used to enable or disable the data flow for a given input or output, and also to monitor the average and peak power levels in an audio channel.
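A minimal sketch of both uses, assuming an output has already been added to a running session (the `movieOutput` variable is illustrative):

```objc
AVCaptureConnection *connection = [movieOutput connectionWithMediaType:AVMediaTypeAudio];
for (AVCaptureAudioChannel *channel in connection.audioChannels) {
    float average = channel.averagePowerLevel;    //average power, in dB
    float peak = channel.peakHoldLevel;           //peak hold level, in dB
    NSLog(@"avg: %f, peak: %f", average, peak);
}
connection.enabled = NO;    //disable the audio data flow for this connection
```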
六. Capture preview
AVCaptureVideoPreviewLayer provides a real-time preview of the captured data. It is similar to AVPlayerLayer but tailored to camera capture. It also supports the concept of video gravity, which controls how the video content is scaled and stretched within the layer:
AVLayerVideoGravityResizeAspect: preserves the video's aspect ratio.
AVLayerVideoGravityResizeAspectFill: preserves the aspect ratio and fills the layer, cropping the video if necessary.
AVLayerVideoGravityResize: fills the layer, distorting the video if necessary.
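A typical preview setup might look like the following sketch (the `session` and `view` variables are assumed to already exist):

```objc
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
previewLayer.frame = view.bounds;
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;    //fill the layer, cropping if needed
[view.layer addSublayer:previewLayer];
```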
//Creating a capture session
AVCaptureSession *session = [[AVCaptureSession alloc] init];
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if ([session canAddInput:input]) { [session addInput:input]; }
AVCaptureStillImageOutput *output = [[AVCaptureStillImageOutput alloc] init];
if ([session canAddOutput:output]) { [session addOutput:output]; }
七. Key points
1. Coordinate space conversion. The capture device's coordinate space differs from the screen's. It is local to the camera sensor, does not rotate with the device, and runs from (0, 0) at the top left to (1, 1) at the bottom right.
2. AVCaptureVideoPreviewLayer provides two methods for converting between the two coordinate spaces:
    captureDevicePointOfInterestForPoint: takes a point in screen coordinates and returns the corresponding point in device coordinates.
    pointForCaptureDevicePointOfInterest: takes a point in device coordinates and returns the corresponding point in screen coordinates.
    These conversions are typically needed for tap-to-focus and tap-to-expose features.
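For example, a tap handler might convert the tap location before handing it to the camera controller; the `previewLayer` and `cameraController` properties here are illustrative, and `focusAtPoint:` is the method implemented in point 7 below:

```objc
- (void)handleTapToFocus:(UITapGestureRecognizer *)recognizer {
    CGPoint screenPoint = [recognizer locationInView:recognizer.view];
    //convert from screen coordinates to the capture device's (0,0)-(1,1) space
    CGPoint devicePoint = [self.previewLayer captureDevicePointOfInterestForPoint:screenPoint];
    [self.cameraController focusAtPoint:devicePoint];
}
```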
3. Setting up the capture session
// CameraController
- (BOOL)setupSession {
    self.captureSession = [[AVCaptureSession alloc] init];    //create the capture session
    NSError *error;
    //get the default video capture device (returns the back camera on an iPhone)
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
    if (videoInput && [self.captureSession canAddInput:videoInput]) {
        [self.captureSession addInput:videoInput];
        self.activeVideoInput = videoInput;
    } else {
        return NO;
    }
    //create the audio capture device input
    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
    if (audioInput && [self.captureSession canAddInput:audioInput]) {
        [self.captureSession addInput:audioInput];
    } else {
        return NO;
    }
    self.imageOutput = [[AVCaptureStillImageOutput alloc] init];    //still image output
    self.imageOutput.outputSettings = @{AVVideoCodecKey: AVVideoCodecJPEG};
    if ([self.captureSession canAddOutput:self.imageOutput]) {
        [self.captureSession addOutput:self.imageOutput];
    }
    self.movieOutput = [[AVCaptureMovieFileOutput alloc] init];    //writes movies to the file system
    if ([self.captureSession canAddOutput:self.movieOutput]) {
        [self.captureSession addOutput:self.movieOutput];
    }
    self.videoQueue = dispatch_queue_create("com.vedioqueue", NULL);
    return YES;
}
4. Starting and stopping the session. A session must be started before it can be used; starting it spins up the data flow so the session is ready to capture images and video. a: Check whether the session is already running, and if not call startRunning. This is a synchronous call that takes some time, so dispatch it asynchronously on videoQueue. b: stopRunning stops the data flow; it is also a synchronous call, so it should likewise be dispatched asynchronously.
- (void)startSession {    //a
    if (![self.captureSession isRunning]) {
        dispatch_async(self.videoQueue, ^{
            [self.captureSession startRunning];
        });
    }
}
- (void)stopSession {    //b
    if ([self.captureSession isRunning]) {
        dispatch_async(self.videoQueue, ^{
            [self.captureSession stopRunning];
        });
    }
}
5. Switching cameras
- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position {
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if (device.position == position) {
            return device;
        }
    }
    return nil;
}
- (AVCaptureDevice *)activeCamera {
    return self.activeVideoInput.device;
}
- (AVCaptureDevice *)inactiveCamera {
    AVCaptureDevice *device = nil;
    if (self.cameraCount > 1) {
        if ([self activeCamera].position == AVCaptureDevicePositionBack) {
            device = [self cameraWithPosition:AVCaptureDevicePositionFront];
        } else {
            device = [self cameraWithPosition:AVCaptureDevicePositionBack];
        }
    }
    return device;
}
- (BOOL)canSwitchCameras {
    return self.cameraCount > 1;
}
- (NSUInteger)cameraCount {
    return [[AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo] count];
}
切換前置和后置攝像頭需要重新配置捕捉會話, 幸運的是可以動態(tài)配置AVCaptureSession, 所以不必擔心停止會話和重啟會話帶來的開銷, 不過對會話進行的任何改變都要通過beginConfiguration和commitConfiguration進行單獨的原子性的變化.
- (BOOL)switchCameras {
    if (![self canSwitchCameras]) { return NO; }
    NSError *error;
    AVCaptureDevice *videoDevice = [self inactiveCamera];
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
    if (videoInput) {
        [self.captureSession beginConfiguration];    //mark the beginning of the atomic configuration change
        [self.captureSession removeInput:self.activeVideoInput];
        if ([self.captureSession canAddInput:videoInput]) {
            [self.captureSession addInput:videoInput];
            self.activeVideoInput = videoInput;
        } else {
            [self.captureSession addInput:self.activeVideoInput];    //restore the old input if the new one cannot be added
        }
        //all the batched changes are committed together as a single atomic modification of the session
        [self.captureSession commitConfiguration];
    } else {
        return NO;    //errors could be handled here before returning
    }
    return YES;
}
6. Configuring capture devices. AVCaptureDevice defines many methods that let developers control the camera, in particular independently adjusting and locking its focus, exposure, and white balance. Focus and exposure also support point-of-interest settings, which make tap-to-focus and tap-to-expose possible. AVCaptureDevice can also control the device's LED, used as a flash when taking photos or as a torch. Whenever you modify a camera setting, first check that the device supports that operation, otherwise an exception is raised; not every camera supports every feature. For example, the front camera may not support focusing, while the back camera supports full-range focus.
AVCaptureDevice *device = ...
if ([device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
    //when a change is supported, the pattern is: lock the device for configuration, apply the change, then unlock
    NSError *error;
    if ([device lockForConfiguration:&error]) {
        device.focusMode = AVCaptureFocusModeAutoFocus;
        [device unlockForConfiguration];
    } else {
        //handle error
    }
}
7. Adjusting focus. a: Ask the camera whether it supports point-of-interest focus. b: Pass in a point that has already been converted from screen coordinates to capture-device coordinates. c: Confirm that the device supports both point-of-interest focus and auto-focus mode; this mode performs a single-scan auto focus.
Implementing tap to focus:
- (BOOL)cameraSupportsTapToFocus {    //a
    return [[self activeCamera] isFocusPointOfInterestSupported];
}
- (void)focusAtPoint:(CGPoint)point {    //b
    AVCaptureDevice *device = [self activeCamera];
    if (device.isFocusPointOfInterestSupported && [device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {    //c
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            device.focusPointOfInterest = point;
            device.focusMode = AVCaptureFocusModeAutoFocus;    //setting the point alone has no effect without a focus mode
            [device unlockForConfiguration];
        } else {
            //handle error
        }
    }
}
8. Tap to expose. a: Check whether the device supports point-of-interest exposure. b: Check whether the device supports locked exposure mode; if it does, use KVO on the device's adjustingExposure property. Observing this property tells us when the exposure adjustment has finished, giving us the chance to lock the exposure at that point. c: Once the device is no longer adjusting its exposure level, confirm that its exposureMode can be set to AVCaptureExposureModeLocked.
- (BOOL)cameraSupportsTapToExpose {    //a
    return [[self activeCamera] isExposurePointOfInterestSupported];
}
static const NSString *THCameraAdjustingExposureContext;
- (void)exposeAtPoint:(CGPoint)point {
    AVCaptureDevice *device = [self activeCamera];
    AVCaptureExposureMode exposureMode = AVCaptureExposureModeContinuousAutoExposure;
    if (device.isExposurePointOfInterestSupported && [device isExposureModeSupported:exposureMode]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            device.exposurePointOfInterest = point;
            device.exposureMode = exposureMode;
            if ([device isExposureModeSupported:AVCaptureExposureModeLocked]) {    //b
                [device addObserver:self
                         forKeyPath:@"adjustingExposure"
                            options:NSKeyValueObservingOptionNew
                            context:&THCameraAdjustingExposureContext];
            }
            [device unlockForConfiguration];
        }
    } else {
        //handle error
    }
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if (context == &THCameraAdjustingExposureContext) {
        AVCaptureDevice *device = (AVCaptureDevice *)object;
        if (!device.isAdjustingExposure && [device isExposureModeSupported:AVCaptureExposureModeLocked]) {    //c
            [object removeObserver:self
                        forKeyPath:@"adjustingExposure"
                           context:&THCameraAdjustingExposureContext];
            dispatch_async(dispatch_get_main_queue(), ^{
                NSError *error;
                if ([device lockForConfiguration:&error]) {
                    device.exposureMode = AVCaptureExposureModeLocked;
                    [device unlockForConfiguration];
                } else {
                    //handle error
                }
            });
        }
    } else {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    }
}
9. Resetting focus and exposure
- (void)resetFocusAndExposureMode {
    AVCaptureDevice *device = [self activeCamera];
    AVCaptureFocusMode focusMode = AVCaptureFocusModeContinuousAutoFocus;
    BOOL canResetFocus = [device isFocusPointOfInterestSupported] && [device isFocusModeSupported:focusMode];
    AVCaptureExposureMode exposureMode = AVCaptureExposureModeContinuousAutoExposure;
    BOOL canResetExposure = [device isExposurePointOfInterestSupported] && [device isExposureModeSupported:exposureMode];
    CGPoint centerPoint = CGPointMake(0.5, 0.5);
    NSError *error;
    if ([device lockForConfiguration:&error]) {
        if (canResetFocus) {
            device.focusMode = focusMode;
            device.focusPointOfInterest = centerPoint;
        }
        if (canResetExposure) {
            device.exposureMode = exposureMode;
            device.exposurePointOfInterest = centerPoint;
        }
        [device unlockForConfiguration];
    } else {
        //handle error
    }
}
10. Adjusting flash and torch modes. AVCaptureDevice lets developers modify the camera's flash and torch modes. The LED on the back of the device serves as a flash when capturing still images and as a continuous light (torch) when recording video. A capture device's flashMode and torchMode properties can each be set to one of three values: AVCapture(Flash|Torch)Mode(On|Off|Auto).
- (BOOL)cameraHasFlash {
    return [[self activeCamera] hasFlash];
}
- (AVCaptureFlashMode)flashMode {
    return [[self activeCamera] flashMode];
}
- (void)setFlashMode:(AVCaptureFlashMode)flashMode {
    AVCaptureDevice *device = [self activeCamera];
    if ([device isFlashModeSupported:flashMode]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            device.flashMode = flashMode;
            [device unlockForConfiguration];
        } else {
            //handle error
        }
    }
}
- (BOOL)cameraHasTorch {
    return [[self activeCamera] hasTorch];
}
- (AVCaptureTorchMode)torchMode {
    return [[self activeCamera] torchMode];
}
- (void)setTorchMode:(AVCaptureTorchMode)torchMode {
    AVCaptureDevice *device = [self activeCamera];
    if ([device isTorchModeSupported:torchMode]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            device.torchMode = torchMode;
            [device unlockForConfiguration];
        } else {
            //handle error
        }
    }
}
11. Capturing still images. In the setupSession implementation we added an AVCaptureStillImageOutput instance, an AVCaptureOutput subclass used for capturing still images, to the capture session. a: Get a pointer to the AVCaptureConnection currently in use by the AVCaptureStillImageOutput; when looking up the still image output's connection you generally pass the AVMediaTypeVideo media type.
- (void)captureStillImage {
    AVCaptureConnection *connection = [self.imageOutput connectionWithMediaType:AVMediaTypeVideo];    //a
    if (connection.isVideoOrientationSupported) {
        connection.videoOrientation = [self currentVideoOrientation];
    }
    id handler = ^(CMSampleBufferRef sampleBuffer, NSError *error) {
        if (sampleBuffer != NULL) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:sampleBuffer];
            UIImage *image = [[UIImage alloc] initWithData:imageData];
            [self writeImageToAssetsLibrary:image];    //hand the image off, e.g. to the assets library (point 12)
        } else {
            //handle error
        }
    };
    [self.imageOutput captureStillImageAsynchronouslyFromConnection:connection completionHandler:handler];
}
- (AVCaptureVideoOrientation)currentVideoOrientation {
    AVCaptureVideoOrientation orientation;
    switch ([UIDevice currentDevice].orientation) {
        case UIDeviceOrientationPortrait:
            orientation = AVCaptureVideoOrientationPortrait;
            break;
        case UIDeviceOrientationLandscapeRight:
            orientation = AVCaptureVideoOrientationLandscapeLeft;    //landscape cases are mirrored
            break;
        case UIDeviceOrientationPortraitUpsideDown:
            orientation = AVCaptureVideoOrientationPortraitUpsideDown;
            break;
        default:
            orientation = AVCaptureVideoOrientationLandscapeRight;
            break;
    }
    return orientation;
}
12. Using the Assets Library framework. The Assets Library framework lets developers work with the user's photo and video libraries.
ALAuthorizationStatus status = [ALAssetsLibrary authorizationStatus];
if (status == ALAuthorizationStatusDenied) {
    //without access
} else {
    //perform authorized access to the library
}
- (void)writeImageToAssetsLibrary:(UIImage *)image {    //write the image to the photo library
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library writeImageToSavedPhotosAlbum:image.CGImage
                              orientation:(ALAssetOrientation)image.imageOrientation
                          completionBlock:^(NSURL *assetURL, NSError *error) {
    }];
}
13. Video capture. When AVCaptureMovieFileOutput starts recording, it writes a minimal header at the front of the file; as recording proceeds, fragments are written at regular intervals to build up a complete header. The interval can be changed through the capture output's movieFragmentInterval property. a: Check the AVCaptureMovieFileOutput's recording state. b: Video stabilization; enabling it when supported can significantly improve the quality of the captured video. c: Smooth auto-focus mode slows down the camera's focusing speed. Normally, when the user moves the camera it tries to refocus quickly, which produces a pulsing effect in the recorded video; smooth auto focus reduces the focusing speed for a more natural-looking recording.
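For instance, the fragment interval could be shortened from its 10-second default with a sketch like this:

```objc
//write a complete movie header every 5 seconds instead of the 10-second default
self.movieOutput.movieFragmentInterval = CMTimeMake(5, 1);
```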
- (BOOL)isRecording {    //a
    return self.movieOutput.isRecording;
}
- (void)startRecording {
    if (![self isRecording]) {
        AVCaptureConnection *videoConnection = [self.movieOutput connectionWithMediaType:AVMediaTypeVideo];
        if ([videoConnection isVideoOrientationSupported]) {
            videoConnection.videoOrientation = self.currentVideoOrientation;
        }
        if ([videoConnection isVideoStabilizationSupported]) {    //b
            videoConnection.enablesVideoStabilizationWhenAvailable = YES;
        }
        AVCaptureDevice *device = [self activeCamera];
        if (device.isSmoothAutoFocusSupported) {    //c
            NSError *error;
            if ([device lockForConfiguration:&error]) {
                device.smoothAutoFocusEnabled = YES;
                [device unlockForConfiguration];
            } else {
                //handle error
            }
        }
        self.outputURL = [self uniqueURL];
        [self.movieOutput startRecordingToOutputFileURL:self.outputURL recordingDelegate:self];
    }
}
- (NSURL *)uniqueURL {
    NSFileManager *fileManager = [NSFileManager defaultManager];
    //temporaryDirectoryWithTemplateString: is a custom NSFileManager category helper, not a system API
    NSString *dirPath = [fileManager temporaryDirectoryWithTemplateString:@"temp"];
    if (dirPath) {
        NSString *filePath = [dirPath stringByAppendingPathComponent:@"1.mov"];
        return [NSURL fileURLWithPath:filePath];
    }
    return nil;
}
- (void)stopRecording {
    if ([self isRecording]) {
        [self.movieOutput stopRecording];
    }
}
14. Implementing the AVCaptureFileOutputRecordingDelegate protocol. a: Before writing to the assets library, check that the video can be written. b: Size the thumbnail according to the video's aspect ratio by constraining only its width, and set appliesPreferredTrackTransform to YES so that the video's preferred transform (e.g. its orientation) is taken into account when generating the thumbnail.
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error {
    if (error) {
        [self.delegate mediaCaptureFailedWithError:error];
    } else {
        [self writeVideoToAssetsLibrary:[self.outputURL copy]];
    }
    self.outputURL = nil;
}
- (void)writeVideoToAssetsLibrary:(NSURL *)videoURL {
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:videoURL]) {    //a
        ALAssetsLibraryWriteVideoCompletionBlock completionBlock;
        completionBlock = ^(NSURL *assetURL, NSError *error) {
            if (error) {
                [self.delegate assetLibraryWriteFailedWithError:error];
            } else {
                [self generateThumbnailForVideoAtURL:videoURL];
            }
        };
        [library writeVideoAtPathToSavedPhotosAlbum:videoURL completionBlock:completionBlock];
    }
}
- (void)generateThumbnailForVideoAtURL:(NSURL *)videoURL {
    dispatch_async(self.videoQueue, ^{
        AVAsset *asset = [AVAsset assetWithURL:videoURL];
        AVAssetImageGenerator *imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
        imageGenerator.maximumSize = CGSizeMake(100.0f, 0.0f);    //b: width fixed at 100, height 0 so it is computed from the aspect ratio
        imageGenerator.appliesPreferredTrackTransform = YES;
        CGImageRef imageRef = [imageGenerator copyCGImageAtTime:kCMTimeZero actualTime:NULL error:nil];
        UIImage *image = [UIImage imageWithCGImage:imageRef];
        CGImageRelease(imageRef);
        dispatch_async(dispatch_get_main_queue(), ^{
            //deliver the thumbnail to the UI
        });
    });
}