ARKit Study Notes - 7

Please credit the source when reposting.
Original Apple documentation: https://developer.apple.com/documentation/arkit/displaying_an_ar_experience_with_metal


Displaying an AR Experience with Metal

Build a custom AR view by rendering camera images and using position-tracking information to display overlay content.


Overview

  • ARKit includes view classes for easily displaying AR experiences with SceneKit or SpriteKit. However, if you instead build your own rendering engine (or integrate with a third-party engine), ARKit also provides all the support necessary to display an AR experience with a custom view.
  • In any AR experience, the first step is to configure an ARSession object to manage camera capture and motion processing. A session defines and maintains a correspondence between the real-world space the device inhabits and a virtual space where you model AR content. To display your AR experience in a custom view, you'll need to:

  • 1. Retrieve video frames and tracking information from the session.

  • 2. Render those frame images as the backdrop for your view.

  • 3. Use the tracking information to position and draw AR content atop the camera image.

Note
This article covers code found in Xcode project templates. For complete example code, create a new iOS application with the Augmented Reality template, and choose Metal from the Content Technology popup menu.

Get Video Frames and Tracking Data from the Session

  • Create and maintain your own ARSession instance, and run it with a session configuration appropriate for the kind of AR experience you want to support. (To do this, see Building a Basic AR Experience.) The session captures video from the camera, tracks the device's position and orientation in a modeled 3D space, and provides ARFrame objects. Each such object contains both an individual video frame image and position tracking information from the moment that frame was captured.
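The setup described above can be sketched as follows. This is a minimal, illustrative example, not the template's exact code; the class and method names other than the ARKit types are assumptions (in the shipping SDK the world-tracking configuration class is named ARWorldTrackingConfiguration).

```swift
import ARKit

// Minimal sketch: own an ARSession and run it with a world-tracking configuration.
class SessionOwner {
    let session = ARSession()

    func start() {
        // World tracking provides the device's position and orientation in 3D space;
        // plane detection is optional and shown here only for illustration.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        session.run(configuration)
    }

    func pause() {
        session.pause()
    }
}
```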

  • There are two ways to access ARFrame objects produced by an AR session, depending on whether your app favors a pull or a push design pattern.

  • If you prefer to control frame timing (the pull design pattern), use the session's currentFrame property to get the current frame image and tracking information each time you redraw your view's contents. The ARKit Xcode template uses this approach:
// in Renderer class, called from MTKViewDelegate.draw(in:) via Renderer.update()
func updateGameState() {        
    guard let currentFrame = session.currentFrame else {
        return
    }
    
    updateSharedUniforms(frame: currentFrame)
    updateAnchors(frame: currentFrame)
    updateCapturedImageTextures(frame: currentFrame)
    
    if viewportSizeDidChange {
        viewportSizeDidChange = false
        
        updateImagePlane(frame: currentFrame)
    }
}
  • Alternatively, if your app design favors a push pattern, implement the session(_:didUpdate:) delegate method, and the session will call it once for each video frame it captures (at 60 frames per second by default).
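A hedged sketch of the push alternative; only session(_:didUpdate:) is named in the text above, so the surrounding class and the comment about where rendering is driven are illustrative assumptions.

```swift
import ARKit

// Sketch: receive each captured frame via ARSessionDelegate (push pattern).
class PushRenderer: NSObject, ARSessionDelegate {
    let session = ARSession()

    override init() {
        super.init()
        // With no delegateQueue set, callbacks arrive on the main queue.
        session.delegate = self
    }

    // Called once for each captured video frame (60 fps by default).
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Use frame.capturedImage and frame.camera.transform to drive rendering here.
    }
}
```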

  • Upon obtaining a frame, you'll need to draw the camera image, and update and render any overlay content your AR experience includes.

Draw the Camera Image

  • Each ARFrame object's capturedImage property contains a pixel buffer captured from the device camera. To draw this image as the backdrop for your custom view, you'll need to create textures from the image content and submit GPU rendering commands that use those textures.

  • The pixel buffer's contents are encoded in a biplanar YCbCr (also called YUV) data format; to render the image you'll need to convert this pixel data to a drawable RGB format. For rendering with Metal, you can perform this conversion most efficiently in GPU shader code. Use CVMetalTextureCache APIs to create two Metal textures from the pixel buffer—one each for the buffer's luma (Y) and chroma (CbCr) planes:

func updateCapturedImageTextures(frame: ARFrame) {
    // Create two textures (Y and CbCr) from the provided frame's captured image
    let pixelBuffer = frame.capturedImage
    if (CVPixelBufferGetPlaneCount(pixelBuffer) < 2) {
        return
    }
    capturedImageTextureY = createTexture(fromPixelBuffer: pixelBuffer, pixelFormat:.r8Unorm, planeIndex:0)!
    capturedImageTextureCbCr = createTexture(fromPixelBuffer: pixelBuffer, pixelFormat:.rg8Unorm, planeIndex:1)!
}

func createTexture(fromPixelBuffer pixelBuffer: CVPixelBuffer, pixelFormat: MTLPixelFormat, planeIndex: Int) -> MTLTexture? {
    var mtlTexture: MTLTexture? = nil
    let width = CVPixelBufferGetWidthOfPlane(pixelBuffer, planeIndex)
    let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, planeIndex)
    
    var texture: CVMetalTexture? = nil
    let status = CVMetalTextureCacheCreateTextureFromImage(nil, capturedImageTextureCache, pixelBuffer, nil, pixelFormat, width, height, planeIndex, &texture)
    if status == kCVReturnSuccess {
        mtlTexture = CVMetalTextureGetTexture(texture!)
    }
    
    return mtlTexture
}
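The createTexture(fromPixelBuffer:pixelFormat:planeIndex:) helper above relies on a capturedImageTextureCache that must be created once up front. A minimal sketch of that one-time setup, assuming `device` is your app's MTLDevice (the function name here is illustrative; the property name matches the snippet above):

```swift
import Metal
import CoreVideo

// Sketch: one-time creation of the CVMetalTextureCache used by createTexture(...).
var capturedImageTextureCache: CVMetalTextureCache!

func makeTextureCache(device: MTLDevice) {
    var textureCache: CVMetalTextureCache?
    CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil, &textureCache)
    capturedImageTextureCache = textureCache
}
```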
  • Next, encode render commands that draw those two textures using a fragment function that performs YCbCr to RGB conversion with a color transform matrix:
fragment float4 capturedImageFragmentShader(ImageColorInOut in [[stage_in]],
                                            texture2d<float, access::sample> capturedImageTextureY [[ texture(kTextureIndexY) ]],
                                            texture2d<float, access::sample> capturedImageTextureCbCr [[ texture(kTextureIndexCbCr) ]]) {
    
    constexpr sampler colorSampler(mip_filter::linear,
                                   mag_filter::linear,
                                   min_filter::linear);
    
    const float4x4 ycbcrToRGBTransform = float4x4(
        float4(+1.164380f, +1.164380f, +1.164380f, +0.000000f),
        float4(+0.000000f, -0.391762f, +2.017230f, +0.000000f),
        float4(+1.596030f, -0.812968f, +0.000000f, +0.000000f),
        float4(-0.874202f, +0.531668f, -1.085630f, +1.000000f)
    );
    
    // Sample Y and CbCr textures to get the YCbCr color at the given texture coordinate
    float4 ycbcr = float4(capturedImageTextureY.sample(colorSampler, in.texCoord).r,
                          capturedImageTextureCbCr.sample(colorSampler, in.texCoord).rg, 1.0);
    
    // Return converted RGB color
    return ycbcrToRGBTransform * ycbcr;
}

Note

  • Use the displayTransform(withViewportSize:orientation:) method to make sure the camera image covers the entire view. For example use of this method, as well as complete Metal pipeline setup code, see the full Xcode template. (Create a new iOS application with the Augmented Reality template, and choose Metal from the Content Technology popup menu.)
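A sketch of how that transform can be applied, remapping the texture coordinates of a full-screen quad so the captured image fills the viewport. Note that in the released SDK the method is spelled displayTransform(for:viewportSize:); the quad layout and function signature here are illustrative assumptions, not the template's exact code.

```swift
import ARKit

// Source vertex data for a full-screen quad: x, y, u, v per vertex.
let imagePlaneVertexData: [Float] = [
    -1.0, -1.0, 0.0, 1.0,
     1.0, -1.0, 1.0, 1.0,
    -1.0,  1.0, 0.0, 0.0,
     1.0,  1.0, 1.0, 0.0,
]

// Sketch: rewrite the quad's texture coordinates with the display transform.
func updateImagePlane(frame: ARFrame,
                      viewportSize: CGSize,
                      orientation: UIInterfaceOrientation,
                      vertexData: UnsafeMutablePointer<Float>) {
    // displayTransform maps normalized image coordinates to normalized viewport
    // coordinates; invert it to go from the viewport back into the captured image.
    let displayToCameraTransform =
        frame.displayTransform(for: orientation, viewportSize: viewportSize).inverted()

    for index in 0...3 {
        let texCoordIndex = 4 * index + 2
        let coord = CGPoint(x: CGFloat(imagePlaneVertexData[texCoordIndex]),
                            y: CGFloat(imagePlaneVertexData[texCoordIndex + 1]))
        let transformed = coord.applying(displayToCameraTransform)
        vertexData[texCoordIndex] = Float(transformed.x)
        vertexData[texCoordIndex + 1] = Float(transformed.y)
    }
}
```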

Track and Render Overlay Content

  • AR experiences typically focus on rendering 3D overlay content so that the content appears to be part of the real world seen in the camera image. To achieve this illusion, use the ARAnchor class to model the position and orientation of your own 3D content relative to real-world space. Anchors provide transforms that you can reference during rendering.

  • For example, the Xcode template creates an anchor located about 20 cm in front of the device whenever a user taps on the screen:

func handleTap(gestureRecognize: UITapGestureRecognizer) {
    // Create anchor using the camera's current position
    if let currentFrame = session.currentFrame {
        
        // Create a transform with a translation of 0.2 meters in front of the camera
        var translation = matrix_identity_float4x4
        translation.columns.3.z = -0.2
        let transform = simd_mul(currentFrame.camera.transform, translation)
        
        // Add a new anchor to the session
        let anchor = ARAnchor(transform: transform)
        session.add(anchor: anchor)
    }
}
  • In your rendering engine, use the transform property of each ARAnchor object to place visual content. The Xcode template uses each of the anchors added to the session in its handleTap method to position a simple cube mesh:
func updateAnchors(frame: ARFrame) {
    // Update the anchor uniform buffer with transforms of the current frame's anchors
    anchorInstanceCount = min(frame.anchors.count, kMaxAnchorInstanceCount)
    
    var anchorOffset: Int = 0
    if anchorInstanceCount == kMaxAnchorInstanceCount {
        anchorOffset = max(frame.anchors.count - kMaxAnchorInstanceCount, 0)
    }
    
    for index in 0..<anchorInstanceCount {
        let anchor = frame.anchors[index + anchorOffset]
        
        // Flip Z axis to convert geometry from right handed to left handed
        var coordinateSpaceTransform = matrix_identity_float4x4
        coordinateSpaceTransform.columns.2.z = -1.0
        
        let modelMatrix = simd_mul(anchor.transform, coordinateSpaceTransform)
        
        let anchorUniforms = anchorUniformBufferAddress.assumingMemoryBound(to: InstanceUniforms.self).advanced(by: index)
        anchorUniforms.pointee.modelMatrix = modelMatrix
    }
}

Note

  • In a more complex AR experience, you can use hit testing or plane detection to find the positions of real-world surfaces. For details, see the planeDetection property and the hitTest(_:types:) method. In both cases, ARKit provides results as ARAnchor objects, so you still use anchor transforms to place visual content.
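A hedged sketch of the hit-testing route: ARFrame's hitTest(_:types:) takes a point in normalized image coordinates, so (0.5, 0.5) is the image center. The function name and the particular result types queried are illustrative choices, not something the text above prescribes.

```swift
import ARKit

// Sketch: place an anchor on the nearest real-world surface under the image center.
func placeAnchorAtImageCenter(session: ARSession) {
    guard let currentFrame = session.currentFrame else { return }

    // Hit-test results come back sorted nearest-first.
    let results = currentFrame.hitTest(CGPoint(x: 0.5, y: 0.5),
                                       types: [.existingPlaneUsingExtent,
                                               .estimatedHorizontalPlane])
    if let nearest = results.first {
        // The result is an ARAnchor-style world transform, used just like
        // the tap-created anchors above.
        session.add(anchor: ARAnchor(transform: nearest.worldTransform))
    }
}
```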

Render with Realistic Lighting

  • When you configure shaders for drawing 3D content in your scene, use the estimated lighting information in each ARFrame object to produce more realistic shading:
// in Renderer.updateSharedUniforms(frame:):
// Set up lighting for the scene using the ambient intensity if provided
var ambientIntensity: Float = 1.0
if let lightEstimate = frame.lightEstimate {
    ambientIntensity = Float(lightEstimate.ambientIntensity) / 1000.0
}
let ambientLightColor: vector_float3 = vector3(0.5, 0.5, 0.5)
uniforms.pointee.ambientLightColor = ambientLightColor * ambientIntensity

Note

  • For the complete set of Metal setup and rendering commands that go with this example, see the full Xcode template. (Create a new iOS application with the Augmented Reality template, and choose Metal from the Content Technology popup menu.)