Screenshots & Recording
Probably because SceneKit was originally designed as a game-rendering framework, it only provides a single screenshot API, snapshot. That is fine for taking still images, but recording video is not particularly convenient.
First, assume we have a view:
@IBOutlet var sceneView: ARSCNView!
Screenshot
Note: AR objects are visible in the screenshot.
To take a screenshot, obtain a UIImage:
// Render the scene offscreen with a dedicated SCNRenderer,
// sharing the Metal device and scene of the on-screen view.
let renderer = SCNRenderer(device: sceneView.device, options: nil)
renderer.scene = sceneView.scene
let image = renderer.snapshot(atTime: CACurrentMediaTime(),
                              with: UIScreen.main.bounds.size,
                              antialiasingMode: .multisampling4X)
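For a quick capture there is also a simpler route: SCNView (and therefore ARSCNView) exposes an instance method snapshot() that returns the currently rendered frame, AR objects included, as a UIImage. A minimal sketch; saving to the photo library is an extra step that requires the photo-library usage description key in Info.plist:

```swift
// Capture the current frame directly from the view.
let image = sceneView.snapshot()
// Optionally save it (needs NSPhotoLibraryAddUsageDescription in Info.plist).
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
```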
Recording
Set the session's delegate; the frame data is delivered through the delegate callback.
Note: AR objects are not visible in this recording; you get only the camera feed.
sceneView.session.delegate = self
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    // The raw camera image for this frame; feed it to an AVAssetWriter to record.
    let pixelBuffer = frame.capturedImage
}
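To actually turn those frames into a video file, the usual pattern is to append each CVPixelBuffer to an AVAssetWriter through an AVAssetWriterInputPixelBufferAdaptor. A minimal sketch, with assumptions: the helper name FrameRecorder, the .mp4/H.264 output settings, and the 600-ticks-per-second timescale are all my choices, not part of the original post:

```swift
import AVFoundation
import CoreVideo

// Hypothetical helper that wraps AVAssetWriter for camera-frame recording.
final class FrameRecorder {
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput
    private let adaptor: AVAssetWriterInputPixelBufferAdaptor
    private var sessionStarted = false

    init(outputURL: URL, width: Int, height: Int) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
        input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: width,
            AVVideoHeightKey: height
        ])
        input.expectsMediaDataInRealTime = true
        adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input,
                                                       sourcePixelBufferAttributes: nil)
        writer.add(input)
        if !writer.startWriting() {
            throw writer.error ?? CocoaError(.fileWriteUnknown)
        }
    }

    func append(_ pixelBuffer: CVPixelBuffer, at time: CMTime) {
        // Start the writer session at the first frame's timestamp.
        if !sessionStarted {
            writer.startSession(atSourceTime: time)
            sessionStarted = true
        }
        guard input.isReadyForMoreMediaData else { return }  // drop frame if busy
        adaptor.append(pixelBuffer, withPresentationTime: time)
    }

    func finish(completion: @escaping () -> Void) {
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}
```

In the ARSessionDelegate callback above you would then convert the frame's timestamp to a CMTime and append:

```swift
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    let time = CMTime(seconds: frame.timestamp, preferredTimescale: 600)
    recorder.append(frame.capturedImage, at: time)
}
```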
If the recording also needs to include the AR objects, there is another approach:
// ARSCNViewDelegate
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    // Hook in as the scene renderer's delegate once content has been added.
    renderer.delegate = self
}
// SCNSceneRendererDelegate
func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
    // The color attachment holds the rendered frame (camera feed plus AR objects).
    guard let texture = renderer.currentRenderPassDescriptor.colorAttachments[0].texture,
          let surface = texture.iosurface else {
        return
    }
    // Wrap the texture's backing IOSurface in a CVPixelBuffer without copying.
    var outPixelBuffer: Unmanaged<CVPixelBuffer>?
    CVPixelBufferCreateWithIOSurface(kCFAllocatorDefault, surface, nil, &outPixelBuffer)
    let buffer = outPixelBuffer?.takeRetainedValue()
    // `buffer` now contains the composited frame and can be written to a video file.
}
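The buffers produced this way can be recorded with the same AVAssetWriter machinery as the camera-only case. A sketch of the glue, assuming `adaptor` is an AVAssetWriterInputPixelBufferAdaptor whose writer session has already been started and `input` is its AVAssetWriterInput (both hypothetical names, set up elsewhere); the render-loop `time` parameter serves as the presentation timestamp:

```swift
func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
    guard let texture = renderer.currentRenderPassDescriptor.colorAttachments[0].texture,
          let surface = texture.iosurface else { return }
    var outPixelBuffer: Unmanaged<CVPixelBuffer>?
    CVPixelBufferCreateWithIOSurface(kCFAllocatorDefault, surface, nil, &outPixelBuffer)
    guard let buffer = outPixelBuffer?.takeRetainedValue(),
          input.isReadyForMoreMediaData else { return }
    // Append the composited frame, timestamped with the render-loop time.
    adaptor.append(buffer, withPresentationTime: CMTime(seconds: time, preferredTimescale: 600))
}
```

Note that updateAtTime fires before the frame is drawn, so the texture may lag by a frame; if that matters, the same body can live in renderer(_:didRenderScene:atTime:) instead.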