When we need QR code scanning, ZXing or ZBar is usually the first thing that comes to mind, but third-party libraries bring several inconveniences: first, the imported library files take up a lot of space; second, they are awkward to use and not efficient enough. Starting with iOS 7, Apple introduced its own barcode-scanning support, and its performance speaks for itself. Without further ado, let's get started.
1. Import the header file
#import <AVFoundation/AVFoundation.h>
2. Declare the related properties
// Capture device, input/output and session properties
@property (strong,nonatomic)AVCaptureDevice *device;
@property (strong,nonatomic)AVCaptureDeviceInput *input;
@property (strong,nonatomic)AVCaptureMetadataOutput *output;
@property (strong,nonatomic)AVCaptureSession *session;
@property (strong,nonatomic)AVCaptureVideoPreviewLayer *preview;
3. Adopt the delegate protocol
@interface TA_qrcode_home ()<AVCaptureMetadataOutputObjectsDelegate>
4. Main scanning implementation
#pragma mark ----- Set up the camera device and scanning components -----
- (void)setupCamera
{
    // Device
    _device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (_device == nil) {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Camera not detected" message:@"No camera device was detected" delegate:self cancelButtonTitle:nil otherButtonTitles:@"OK", nil];
        [alert show];
        return;
    }
    // Input
    _input = [AVCaptureDeviceInput deviceInputWithDevice:self.device error:nil];
    // Output
    _output = [[AVCaptureMetadataOutput alloc] init];
    [_output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
    // Session
    _session = [[AVCaptureSession alloc] init];
    // Limit the scanning area (rectOfInterest uses the capture output's coordinate space,
    // so the on-screen rect has to be converted)
    CGSize size = self.view.bounds.size;
    CGRect cropRect = self.sweepFrame;
    CGFloat p1 = size.height / size.width;
    CGFloat p2 = 1920. / 1080.; // the 1080p video output is used here
    if (p1 < p2) {
        CGFloat fixHeight = size.width * 1920. / 1080.;
        CGFloat fixPadding = (fixHeight - size.height) / 2;
        _output.rectOfInterest = CGRectMake((cropRect.origin.y + fixPadding) / fixHeight,
                                            cropRect.origin.x / size.width,
                                            cropRect.size.height / fixHeight,
                                            cropRect.size.width / size.width);
    } else {
        CGFloat fixWidth = size.height * 1080. / 1920.;
        CGFloat fixPadding = (fixWidth - size.width) / 2;
        _output.rectOfInterest = CGRectMake(cropRect.origin.y / size.height,
                                            (cropRect.origin.x + fixPadding) / fixWidth,
                                            cropRect.size.height / size.height,
                                            cropRect.size.width / fixWidth);
    }
    [_session setSessionPreset:AVCaptureSessionPresetHigh];
    if ([_session canAddInput:self.input]) {
        [_session addInput:self.input];
    }
    if ([_session canAddOutput:self.output]) {
        [_session addOutput:self.output];
    }
    NSLog(@"%@", [_output availableMetadataObjectTypes]);
    // Metadata object types to recognize; AVMetadataObjectTypeQRCode alone would scan QR codes only,
    // this array also covers the common 1D barcode formats (must be set after the output is added to the session)
    [_output setMetadataObjectTypes:[NSArray arrayWithObjects:AVMetadataObjectTypeEAN13Code, AVMetadataObjectTypeEAN8Code, AVMetadataObjectTypeCode128Code, AVMetadataObjectTypeQRCode, nil]];
    // Preview
    _preview = [AVCaptureVideoPreviewLayer layerWithSession:self.session];
    _preview.videoGravity = AVLayerVideoGravityResizeAspectFill;
    _preview.frame = CGRectMake(0, 0, ScreenFrameWidth, ScreenFrameHeight - 44);
    [self.view.layer insertSublayer:self.preview atIndex:0];
    // Start
    [_session startRunning];
}
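For completeness, here is a sketch of how setupCamera might be called from a view controller, with a camera-permission check first. The viewDidLoad call site is an assumption, not part of the original code; note also that on iOS 10 and later an NSCameraUsageDescription entry in Info.plist is required.
// Sketch: check camera authorization before setting up the capture session (assumed call site).
- (void)viewDidLoad
{
    [super viewDidLoad];
    AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
    if (status == AVAuthorizationStatusAuthorized) {
        [self setupCamera];
    } else if (status == AVAuthorizationStatusNotDetermined) {
        // First launch: ask for permission, then set up on the main queue if granted
        [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
            dispatch_async(dispatch_get_main_queue(), ^{
                if (granted) {
                    [self setupCamera];
                }
            });
        }];
    } else {
        // Denied or restricted: prompt the user to enable camera access in Settings
        NSLog(@"Camera access denied; ask the user to enable it in Settings.");
    }
}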
5. How to limit the scan area
See this link for a very detailed explanation: http://www.reibang.com/p/3fb24fc7b415
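As an alternative to the manual ratio math above, AVCaptureVideoPreviewLayer offers metadataOutputRectOfInterestForRect:, which converts a rect in preview-layer coordinates into the output's coordinate space. A minimal sketch, assuming the scan window is self.sweepFrame; the conversion only returns meaningful values once the session has started running, hence the notification.
// Sketch: derive rectOfInterest from a rect expressed in preview-layer coordinates.
CGRect scanRect = self.sweepFrame; // the visible scan window, in view coordinates (assumption)
__weak typeof(self) weakSelf = self;
[[NSNotificationCenter defaultCenter] addObserverForName:AVCaptureSessionDidStartRunningNotification
                                                  object:self.session
                                                   queue:[NSOperationQueue mainQueue]
                                              usingBlock:^(NSNotification *note) {
    // Convert once the session is live, otherwise the result is not reliable
    weakSelf.output.rectOfInterest = [weakSelf.preview metadataOutputRectOfInterestForRect:scanRect];
}];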
6. Delegate callback method
#pragma mark -----AVCaptureMetadataOutputObjectsDelegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection
{
    if ([metadataObjects count] > 0) {
        AVMetadataMachineReadableCodeObject *metadataObject = [metadataObjects objectAtIndex:0];
        self.resultStr = metadataObject.stringValue;
    }
    [self stopDevice];
    ...
}
- (void)stopDevice
{
    [_session stopRunning];
    _session = nil;
    [self.preview removeFromSuperlayer];
    [self.sweepViewHandler.timer invalidate];
}
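Once the session has been stopped, the scanned string in self.resultStr can be handled however the app needs. A minimal sketch, assuming a hypothetical handleScanResult: helper and vibration feedback via AudioToolbox (none of this is part of the original code):
#import <AudioToolbox/AudioToolbox.h>

// Sketch: one possible way to act on the scanned string; runs on the main queue
// because the metadata delegate was registered with dispatch_get_main_queue().
- (void)handleScanResult:(NSString *)result
{
    AudioServicesPlaySystemSound(kSystemSoundID_Vibrate); // brief vibration as feedback
    if ([result hasPrefix:@"http"]) {
        // e.g. open the link or push a web view controller
        NSLog(@"Scanned URL: %@", result);
    } else {
        NSLog(@"Scanned text: %@", result);
    }
}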
7. Drawing the scan-window overlay (many people assume the usual scan-window UI appears automatically once the camera is started; in fact it has to be drawn manually, which I won't cover in detail here). See the sketch after the link below for one common approach.
This article covers it well; refer to it if needed: http://www.reibang.com/p/05949cc8f7af
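One common way to draw the dimmed mask with a transparent scan window is an even-odd filled CAShapeLayer on top of the preview layer. A minimal sketch, assuming the scan window is self.sweepFrame and a 50% black dim; this is an illustration, not the approach from the article above.
// Sketch: dim everything except the scan window using an even-odd fill (CAShapeLayer requires QuartzCore).
CGRect scanRect = self.sweepFrame; // the transparent scan window, in view coordinates (assumption)
UIBezierPath *maskPath = [UIBezierPath bezierPathWithRect:self.view.bounds];
[maskPath appendPath:[UIBezierPath bezierPathWithRect:scanRect]];

CAShapeLayer *maskLayer = [CAShapeLayer layer];
maskLayer.path = maskPath.CGPath;
maskLayer.fillRule = kCAFillRuleEvenOdd;                              // punch a hole where the two rects overlap
maskLayer.fillColor = [UIColor colorWithWhite:0 alpha:0.5].CGColor;   // semi-transparent dim
[self.view.layer addSublayer:maskLayer];                              // sits above the preview layer inserted at index 0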