- ARKit: add text by tapping the screen
- ARKit: add a 3D model by tapping the screen
- ARKit: add a 3D model automatically when a plane is detected
- Minimal usage of QuickLook
- ARKit face texturing
- ARKit smile detection
- ARKit frown detection
- ARKit face parameters: BlendShapes in detail
- Demo
1. ARKit: add text by tapping the screen
1.點擊屏幕增加文字.gif
- Press command+shift+n to create a new project and choose Augmented Reality App
- Under Content Technology, select SpriteKit
Control how far the text sits from the camera (change this z value and observe the effect):
matrix_float4x4 translation = matrix_identity_float4x4;
translation.columns[3].z = -1; // 1 meter in front of the camera
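For context, here is how this translation is combined with the camera transform to place the text, following Apple's SpriteKit AR template (a sketch; `sceneView` is assumed to be the ARSKView outlet the template generates):

```objectivec
// In touchesBegan: add an anchor 1m in front of the camera;
// the ARSKView delegate then supplies the node to render for it.
- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    ARFrame *currentFrame = self.sceneView.session.currentFrame;
    if (!currentFrame) return;
    matrix_float4x4 translation = matrix_identity_float4x4;
    translation.columns[3].z = -1; // 1 meter in front of the camera
    matrix_float4x4 transform = matrix_multiply(currentFrame.camera.transform, translation);
    ARAnchor *anchor = [[ARAnchor alloc] initWithTransform:transform];
    [self.sceneView.session addAnchor:anchor];
}

#pragma mark - ARSKViewDelegate
// Return the SpriteKit node to display for the newly added anchor
- (SKNode *)view:(ARSKView *)view nodeForAnchor:(ARAnchor *)anchor {
    SKLabelNode *labelNode = [SKLabelNode labelNodeWithText:@"Hello ARKit"];
    labelNode.horizontalAlignmentMode = SKLabelHorizontalAlignmentModeCenter;
    labelNode.verticalAlignmentMode = SKLabelVerticalAlignmentModeCenter;
    return labelNode;
}
```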
2. ARKit: add a 3D model by tapping the screen
2.點擊屏幕增加3D模型.gif
2.1 Scene capture
Three classes do the main work:
- ARSCNView: displays the rendered scene
- ARConfiguration: captures the scene
  - ARWorldTrackingConfiguration: uses the rear camera
  - ARFaceTrackingConfiguration: uses the front camera and tracks facial expression features in real time
- ARSession: shuttles data between the two
Initialize the resources in viewDidLoad:
self.arSCNView = [[ARSCNView alloc] initWithFrame:self.view.bounds options:nil];
self.arSCNView.session = [[ARSession alloc] init];
// 1. Create a world-tracking configuration (requires an A9 chip or later, i.e. iPhone 6s and up)
self.arWordTrackingConfiguration = [[ARWorldTrackingConfiguration alloc] init];
// 2. Detect horizontal planes
self.arWordTrackingConfiguration.planeDetection = ARPlaneDetectionHorizontal;
self.arWordTrackingConfiguration.lightEstimationEnabled = YES;
Start the session in viewDidAppear:
[self.arSession runWithConfiguration:self.arWordTrackingConfiguration];
2.2 Add a 3D model on tap
When the screen is tapped, load a .scn file and add its node as a child of self.arSCNView.scene.rootNode:
- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    // 1. Load the scn file into a scene
    SCNScene *scene = [SCNScene sceneNamed:@"art.scnassets/ship.scn"];
    SCNNode *shipNode = scene.rootNode.childNodes.firstObject;
    shipNode.position = SCNVector3Make(0, -1, -1);
    [self.arSCNView.scene.rootNode addChildNode:shipNode];
}
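The ship above is placed at a fixed offset from the world origin. A common refinement is to place it where the tap intersects the real world, using a hit test against detected feature points. A sketch (using the classic ARSCNView hit-test API, which was later deprecated in favor of raycasting):

```objectivec
- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    CGPoint point = [touches.anyObject locationInView:self.arSCNView];
    // Hit-test the tap location against feature points ARKit has detected
    NSArray<ARHitTestResult *> *results =
        [self.arSCNView hitTest:point types:ARHitTestResultTypeFeaturePoint];
    ARHitTestResult *result = results.firstObject;
    if (!result) return;
    SCNScene *scene = [SCNScene sceneNamed:@"art.scnassets/ship.scn"];
    SCNNode *shipNode = scene.rootNode.childNodes.firstObject;
    // The translation column of the hit's world transform is the tapped world position
    simd_float4 column = result.worldTransform.columns[3];
    shipNode.position = SCNVector3Make(column.x, column.y, column.z);
    [self.arSCNView.scene.rootNode addChildNode:shipNode];
}
```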
3. ARKit: add a 3D model automatically when a plane is detected
3.檢測到平面增加3D模型.gif
The setup is the same as in 2.1, with one addition:
self.arSCNView.delegate = self
Then implement the delegate method renderer:didAddNode:forAnchor: as follows:
#pragma mark - ARSCNViewDelegate
// Called when a node is added (with plane detection enabled, ARKit automatically adds a node when it detects a plane)
- (void)renderer:(id<SCNSceneRenderer>)renderer didAddNode:(SCNNode *)node forAnchor:(ARAnchor *)anchor {
    if (![anchor isMemberOfClass:[ARPlaneAnchor class]]) return;
    // Add a flat 3D model. ARKit only detects: the anchor is just a position in space.
    // To actually see that position, we attach a flat 3D model to render it.
    // 1. Get the detected plane anchor
    ARPlaneAnchor *planeAnchor = (ARPlaneAnchor *)anchor;
    // 2. Create a 3D model (the detected plane is an irregularly sized rectangle; here it is turned into a square and scaled down)
    // Parameters: width, height, length, chamfer radius
    SCNBox *planeBox = [SCNBox boxWithWidth:planeAnchor.extent.x * 0.3 height:0 length:planeAnchor.extent.x * 0.3 chamferRadius:0];
    // 3. Render the model with a material (by default the model is white)
    planeBox.firstMaterial.diffuse.contents = [UIColor clearColor];
    // 4. Create a node from the geometry
    SCNNode *planeNode = [SCNNode nodeWithGeometry:planeBox];
    // 5. Position the node at the center of the detected plane anchor
    // In SceneKit, a node's position is a 3D vector built with SCNVector3Make
    planeNode.position = SCNVector3Make(planeAnchor.center.x, 0, planeAnchor.center.z);
    [node addChildNode:planeNode];
    // 6. Load the vase scene
    SCNScene *scene = [SCNScene sceneNamed:@"art.scnassets/vase/vase.scn"];
    // 7. Get the vase node
    // A scene can have many nodes, but exactly one root node; every other node is a descendant of it
    SCNNode *vaseNode = scene.rootNode.childNodes.firstObject;
    // 8. Position the vase at the detected plane; if unset, it defaults to the origin, i.e. the camera position
    vaseNode.position = SCNVector3Make(planeAnchor.center.x, 0, planeAnchor.center.z);
    // 9. Add the vase node to the scene
    // !!! IMPORTANT: the vase is added to the node ARKit supplied in this callback, not to the AR view's root node,
    // because the plane anchor's center is in that node's local coordinate space, not in world coordinates
    [node addChildNode:vaseNode];
}
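ARKit keeps refining a detected plane after this initial callback. If you want the flat box to track those refinements as the plane grows, the companion delegate method can resize it; a sketch, assuming the box node created above is the anchor node's first child:

```objectivec
// Called whenever ARKit refines an existing plane anchor
- (void)renderer:(id<SCNSceneRenderer>)renderer didUpdateNode:(SCNNode *)node forAnchor:(ARAnchor *)anchor {
    if (![anchor isMemberOfClass:[ARPlaneAnchor class]]) return;
    ARPlaneAnchor *planeAnchor = (ARPlaneAnchor *)anchor;
    SCNNode *planeNode = node.childNodes.firstObject;
    if ([planeNode.geometry isKindOfClass:[SCNBox class]]) {
        SCNBox *planeBox = (SCNBox *)planeNode.geometry;
        // Grow the box with the refined plane extent and recenter it
        planeBox.width = planeAnchor.extent.x * 0.3;
        planeBox.length = planeAnchor.extent.z * 0.3;
        planeNode.position = SCNVector3Make(planeAnchor.center.x, 0, planeAnchor.center.z);
    }
}
```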
4. Minimal usage of QuickLook
4.QuickLook簡單使用.gif
Not much to explain here; straight to the code:
#import "ViewController.h"
#import <QuickLook/QuickLook.h>
#import "WYPreviewItem.h"

@interface ViewController () <QLPreviewControllerDataSource, QLPreviewControllerDelegate>
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
}

- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    QLPreviewController *preVC = [[QLPreviewController alloc] init];
    preVC.dataSource = self;
    preVC.delegate = self;
    [self presentViewController:preVC animated:YES completion:nil];
}

#pragma mark - QLPreviewControllerDataSource && QLPreviewControllerDelegate
- (NSInteger)numberOfPreviewItemsInPreviewController:(QLPreviewController *)controller {
    return 1;
}

- (id<QLPreviewItem>)previewController:(QLPreviewController *)controller previewItemAtIndex:(NSInteger)index {
    // NSURL conforms to QLPreviewItem, so it can be returned directly
    return [[NSBundle mainBundle] URLForResource:@"plantpot.usdz" withExtension:nil];
}

- (UIImage *)previewController:(QLPreviewController *)controller transitionImageForPreviewItem:(id<QLPreviewItem>)item contentRect:(CGRect *)contentRect {
    CGRect rect = CGRectMake(100, 200, 300, 300);
    *contentRect = rect;
    return [UIImage imageNamed:@"wy.jpeg"];
}

@end
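The code above imports WYPreviewItem.h but ends up returning a plain NSURL, which already conforms to QLPreviewItem. A custom item class like WYPreviewItem is only needed when you want control over things like the preview title. Its contents are not shown in the source; a minimal sketch of what such a class might look like (the property list here is an assumption, matching the two properties QLPreviewItem declares):

```objectivec
// WYPreviewItem.h (hypothetical sketch - the real file is not shown in this article)
#import <Foundation/Foundation.h>
#import <QuickLook/QuickLook.h>

@interface WYPreviewItem : NSObject <QLPreviewItem>
@property (nonatomic, strong) NSURL *previewItemURL;    // required by QLPreviewItem
@property (nonatomic, copy) NSString *previewItemTitle; // optional custom title
@end

// WYPreviewItem.m
@implementation WYPreviewItem
@end
```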
5. ARKit face texturing
5.人臉貼圖.gif
Set the session's configuration to ARFaceTrackingConfiguration, then add an SCNNode in the ARSCNView delegate method renderer:willUpdateNode:forAnchor:.
Core code:
- Create the SCNNode
- Try setting fillMesh to YES and see what happens
- Try setting material.diffuse.contents to a plain color and see what happens
- (SCNNode *)textureMaskNode {
    if (!_textureMaskNode) {
        id<MTLDevice> device = self.arSCNView.device;
        ARSCNFaceGeometry *geometry = [ARSCNFaceGeometry faceGeometryWithDevice:device fillMesh:NO];
        SCNMaterial *material = geometry.firstMaterial;
        material.fillMode = SCNFillModeFill;
        material.diffuse.contents = [UIImage imageNamed:@"wy.jpg"];
        _textureMaskNode = [SCNNode nodeWithGeometry:geometry];
        _textureMaskNode.name = @"textureMask";
    }
    return _textureMaskNode;
}
- Add the SCNNode and keep the face geometry updated
- (void)renderer:(id<SCNSceneRenderer>)renderer willUpdateNode:(SCNNode *)node forAnchor:(ARAnchor *)anchor {
    if (!anchor || ![anchor isKindOfClass:[ARFaceAnchor class]]) return;
    ARFaceAnchor *faceAnchor = (ARFaceAnchor *)anchor;
    if (!_textureMaskNode) {
        [node addChildNode:self.textureMaskNode];
    }
    ARSCNFaceGeometry *faceGeometry = (ARSCNFaceGeometry *)self.textureMaskNode.geometry;
    if (faceGeometry && [faceGeometry isKindOfClass:[ARSCNFaceGeometry class]]) {
        [faceGeometry updateFromFaceGeometry:faceAnchor.geometry];
    }
}
6. ARKit smile detection
6.微笑檢測.gif
This uses the keys ARBlendShapeLocationMouthSmileLeft and ARBlendShapeLocationMouthSmileRight, which represent a smile.
The demo I provide is for tuning the smile threshold.
Core code:
- (void)renderer:(id<SCNSceneRenderer>)renderer didUpdateNode:(SCNNode *)node forAnchor:(ARAnchor *)anchor {
    if (!anchor || ![anchor isKindOfClass:[ARFaceAnchor class]]) return;
    ARFaceAnchor *faceAnchor = (ARFaceAnchor *)anchor;
    NSDictionary *blendShapes = faceAnchor.blendShapes;
    CGFloat leftSmile = [blendShapes[ARBlendShapeLocationMouthSmileLeft] floatValue];
    CGFloat rightSmile = [blendShapes[ARBlendShapeLocationMouthSmileRight] floatValue];
    NSLog(@"leftSmile = %f, rightSmile = %f", leftSmile, rightSmile);
    if (leftSmile > self.smileValue && rightSmile > self.smileValue) {
        NSLog(@"Smile detected");
        [self.arSession pause];
        dispatch_async(dispatch_get_main_queue(), ^{
            self.resultLabel.hidden = NO;
        });
    }
}
7. ARKit frown detection
7.皺眉檢測.gif
Here I use the inner-brow-raise key ARBlendShapeLocationBrowInnerUp. (Strictly speaking, lowered brows map to ARBlendShapeLocationBrowDownLeft/Right; BrowInnerUp rises when the inner brows lift, as in a worried expression.)
Core code:
- (void)renderer:(id<SCNSceneRenderer>)renderer didUpdateNode:(SCNNode *)node forAnchor:(ARAnchor *)anchor {
    if (!anchor || ![anchor isKindOfClass:[ARFaceAnchor class]]) return;
    ARFaceAnchor *faceAnchor = (ARFaceAnchor *)anchor;
    NSDictionary *blendShapes = faceAnchor.blendShapes;
    NSNumber *browInnerUp = blendShapes[ARBlendShapeLocationBrowInnerUp];
    if ([browInnerUp floatValue] > self.browValue) {
        [self.arSession pause];
        dispatch_async(dispatch_get_main_queue(), ^{
            self.resultLabel.hidden = NO;
        });
    }
    NSLog(@"browInnerUp = %@", browInnerUp);
}
8. BlendShapes
- Available on iOS 11 and later. For a detailed description and comparison image of each parameter, open Xcode → Window → Developer Documentation and search for the key
- Each key maps to a value between 0 and 1
- The framework defines 52 such coefficients for facial features (ARBlendShapeLocationTongueOut was added in iOS 12)
Key | Description | Notes |
---|---|---|
ARBlendShapeLocationBrowDownLeft | Outer part of the left eyebrow moves down | |
ARBlendShapeLocationBrowDownRight | Outer part of the right eyebrow moves down | |
ARBlendShapeLocationBrowInnerUp | Inner parts of both eyebrows move up | |
ARBlendShapeLocationBrowOuterUpLeft | Outer part of the left eyebrow moves up | |
ARBlendShapeLocationBrowOuterUpRight | Outer part of the right eyebrow moves up | |
ARBlendShapeLocationCheekPuff | Both cheeks puff outward | |
ARBlendShapeLocationCheekSquintLeft | Cheek rises around the left eye (squint) | |
ARBlendShapeLocationCheekSquintRight | Cheek rises around the right eye (squint) | |
ARBlendShapeLocationEyeBlinkLeft | Left eye blinks | |
ARBlendShapeLocationEyeBlinkRight | Right eye blinks | |
ARBlendShapeLocationEyeLookDownLeft | Left eyelid movement consistent with a downward gaze | |
ARBlendShapeLocationEyeLookDownRight | Right eyelid movement consistent with a downward gaze | |
ARBlendShapeLocationEyeLookInLeft | Left eyelid movement consistent with a rightward gaze | |
ARBlendShapeLocationEyeLookInRight | Right eyelid movement consistent with a leftward gaze | |
ARBlendShapeLocationEyeLookOutLeft | Left eyelid movement consistent with a leftward gaze | |
ARBlendShapeLocationEyeLookOutRight | Right eyelid movement consistent with a rightward gaze | |
ARBlendShapeLocationEyeSquintLeft | Contraction of the face around the left eye | |
ARBlendShapeLocationEyeSquintRight | Contraction of the face around the right eye | |
ARBlendShapeLocationEyeWideLeft | Eyelids widen around the left eye | |
ARBlendShapeLocationEyeWideRight | Eyelids widen around the right eye | |
ARBlendShapeLocationJawForward | Jaw moves forward | |
ARBlendShapeLocationJawLeft | Jaw moves left | |
ARBlendShapeLocationJawOpen | Jaw opens | |
ARBlendShapeLocationJawRight | Jaw moves right | |
ARBlendShapeLocationMouthClose | Lip closure, independent of jaw position | |
ARBlendShapeLocationMouthDimpleLeft | Left mouth corner pulls back | |
ARBlendShapeLocationMouthDimpleRight | Right mouth corner pulls back | |
ARBlendShapeLocationMouthFrownLeft | Left mouth corner moves down | |
ARBlendShapeLocationMouthFrownRight | Right mouth corner moves down | |
ARBlendShapeLocationMouthFunnel | Both lips contract into an open (funnel) shape | |
ARBlendShapeLocationMouthLeft | Both lips move left | |
ARBlendShapeLocationMouthLowerDownLeft | Left side of the lower lip moves down | |
ARBlendShapeLocationMouthLowerDownRight | Right side of the lower lip moves down | |
ARBlendShapeLocationMouthPressLeft | Left side of the lower lip presses upward | |
ARBlendShapeLocationMouthPressRight | Right side of the lower lip presses upward | |
ARBlendShapeLocationMouthPucker | Contraction and compression of both closed lips (pucker) | |
ARBlendShapeLocationMouthRight | Both lips move right | |
ARBlendShapeLocationMouthRollLower | Lower lip rolls toward the inside of the mouth | |
ARBlendShapeLocationMouthRollUpper | Upper lip rolls toward the inside of the mouth | |
ARBlendShapeLocationMouthShrugLower | Lower lip moves outward | |
ARBlendShapeLocationMouthShrugUpper | Upper lip moves outward | |
ARBlendShapeLocationMouthSmileLeft | Left mouth corner moves up | |
ARBlendShapeLocationMouthSmileRight | Right mouth corner moves up | |
ARBlendShapeLocationMouthStretchLeft | Left mouth corner moves left | |
ARBlendShapeLocationMouthStretchRight | Right mouth corner moves right | |
ARBlendShapeLocationMouthUpperUpLeft | Left side of the upper lip moves up | |
ARBlendShapeLocationMouthUpperUpRight | Right side of the upper lip moves up | |
ARBlendShapeLocationNoseSneerLeft | Left nostril raises | |
ARBlendShapeLocationNoseSneerRight | Right nostril raises | |
ARBlendShapeLocationTongueOut | Tongue sticks out | |