Click here to get the sample code for this article. The code for this article is on the ARKit branch.
If you want to learn the basics of ARKit, see ARKit & OpenGL ES - ARKit原理篇.
If you want to learn more about OpenGL ES, head over to the OpenGL ES article index.
The base OpenGL code used in this article comes from the OpenGL ES series. It already supports rendering geometry, textures, and other basics, so those implementation details won't be repeated here.
The key code for integrating ARKit all lives in ARGLBaseViewController. Let's take a look at it.
Handling ARFrame
- (void)session:(ARSession *)session didUpdateFrame:(ARFrame *)frame {
    // Copy the YUV planes into yTexture and uvTexture
    CVPixelBufferRef pixelBuffer = frame.capturedImage;
    // The plane base addresses are only valid while the buffer is locked
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    GLsizei imageWidth = (GLsizei)CVPixelBufferGetWidthOfPlane(pixelBuffer, 0);
    GLsizei imageHeight = (GLsizei)CVPixelBufferGetHeightOfPlane(pixelBuffer, 0);
    void *baseAddress = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);

    glBindTexture(GL_TEXTURE_2D, self.yTexture);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, imageWidth, imageHeight, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, baseAddress);
    glBindTexture(GL_TEXTURE_2D, 0);

    imageWidth = (GLsizei)CVPixelBufferGetWidthOfPlane(pixelBuffer, 1);
    imageHeight = (GLsizei)CVPixelBufferGetHeightOfPlane(pixelBuffer, 1);
    void *laAddress = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);

    glBindTexture(GL_TEXTURE_2D, self.uvTexture);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE_ALPHA, imageWidth, imageHeight, 0, GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, laAddress);
    glBindTexture(GL_TEXTURE_2D, 0);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    self.videoPlane.yuv_yTexture = self.yTexture;
    self.videoPlane.yuv_uvTexture = self.uvTexture;
    [self setupViewport:CGSizeMake(imageHeight, imageWidth)];

    // Sync the camera transform
    matrix_float4x4 cameraMatrix = matrix_invert([frame.camera transform]);
    GLKMatrix4 newCameraMatrix = GLKMatrix4Identity;
    for (int col = 0; col < 4; ++col) {
        for (int row = 0; row < 4; ++row) {
            newCameraMatrix.m[col * 4 + row] = cameraMatrix.columns[col][row];
        }
    }
    self.cameraMatrix = newCameraMatrix;

    GLKVector3 forward = GLKVector3Make(-self.cameraMatrix.m13, -self.cameraMatrix.m23, -self.cameraMatrix.m33);
    GLKMatrix4 rotationMatrix = GLKMatrix4MakeRotation(M_PI / 2, forward.x, forward.y, forward.z);
    self.cameraMatrix = GLKMatrix4Multiply(rotationMatrix, newCameraMatrix);
}
The code above shows how to handle the ARFrame captured by ARKit. ARFrame's capturedImage holds the image data captured by the camera; its type is CVPixelBufferRef. By default the image data is in YUV format, stored in two planes, which you can think of as two images. One is the Y (luminance) plane, holding brightness information; the other is the UV (chrominance, or chroma) plane, holding hue and saturation. We bind these two planes to separate textures, then convert YUV to RGB in a shader. Below is the fragment shader that samples the two textures and performs the color conversion with the standard formula.
precision highp float;

varying vec3 fragNormal;
varying vec2 fragUV;

uniform float elapsedTime;
uniform mat4 normalMatrix;
uniform sampler2D yMap;
uniform sampler2D uvMap;

void main(void) {
    vec4 Y_planeColor = texture2D(yMap, fragUV);
    vec4 CbCr_planeColor = texture2D(uvMap, fragUV);

    float Cb, Cr, Y;
    float R, G, B;
    Y = Y_planeColor.r * 255.0;
    Cb = CbCr_planeColor.r * 255.0 - 128.0;
    Cr = CbCr_planeColor.a * 255.0 - 128.0;

    // Full-range BT.601 YCbCr -> RGB
    R = 1.402 * Cr + Y;
    G = -0.344 * Cb - 0.714 * Cr + Y;
    B = 1.772 * Cb + Y;

    vec4 videoColor = vec4(R / 255.0, G / 255.0, B / 255.0, 1.0);
    gl_FragColor = videoColor;
}
After the textures are processed and bound, the viewport is recomputed with [self setupViewport:CGSizeMake(imageHeight, imageWidth)] so that the textures are not stretched non-uniformly on different screen sizes. Next, the camera transform computed by ARKit is assigned to self.cameraMatrix. Note that the image ARKit captures must be rotated 90 degrees before it displays correctly, which is why the width and height are deliberately swapped when setting the viewport, and why the camera is rotated at the end.
VideoPlane
VideoPlane is a piece of geometry written to display the video. It accepts two textures, Y and UV.
@interface VideoPlane : GLObject

@property (assign, nonatomic) GLuint yuv_yTexture;
@property (assign, nonatomic) GLuint yuv_uvTexture;

- (instancetype)initWithGLContext:(GLContext *)context;
- (void)update:(NSTimeInterval)timeSinceLastUpdate;
- (void)draw:(GLContext *)glContext;

@end

...

- (void)draw:(GLContext *)glContext {
    [glContext setUniformMatrix4fv:@"modelMatrix" value:self.modelMatrix];
    bool canInvert;
    GLKMatrix4 normalMatrix = GLKMatrix4InvertAndTranspose(self.modelMatrix, &canInvert);
    [glContext setUniformMatrix4fv:@"normalMatrix" value:canInvert ? normalMatrix : GLKMatrix4Identity];
    [glContext bindTextureName:self.yuv_yTexture to:GL_TEXTURE0 uniformName:@"yMap"];
    [glContext bindTextureName:self.yuv_uvTexture to:GL_TEXTURE1 uniformName:@"uvMap"];
    [glContext drawTrianglesWithVAO:vao vertexCount:6];
}
The rest is simple: it draws a quad which, together with the video shader, renders the YUV data.
Perspective Projection Matrix
From ARFrame we can get the textures and the camera matrix needed for rendering. Besides these, a perspective projection matrix that matches the real camera is also required; it makes the perspective of the rendered 3D objects look natural.
- (void)session:(ARSession *)session cameraDidChangeTrackingState:(ARCamera *)camera {
    matrix_float4x4 projectionMatrix = [camera projectionMatrixWithViewportSize:self.viewport.size orientation:UIInterfaceOrientationPortrait zNear:0.1 zFar:1000];
    GLKMatrix4 newWorldProjectionMatrix = GLKMatrix4Identity;
    for (int col = 0; col < 4; ++col) {
        for (int row = 0; row < 4; ++row) {
            newWorldProjectionMatrix.m[col * 4 + row] = projectionMatrix.columns[col][row];
        }
    }
    self.worldProjectionMatrix = newWorldProjectionMatrix;
}
The code above demonstrates how to obtain the 3D perspective projection matrix from ARKit. With the projection matrix and the camera matrix in hand, rendering objects with OpenGL becomes straightforward.
- (void)glkView:(GLKView *)view drawInRect:(CGRect)rect {
    [super glkView:view drawInRect:rect];
    [self.objects enumerateObjectsUsingBlock:^(GLObject *obj, NSUInteger idx, BOOL *stop) {
        [obj.context active];
        [obj.context setUniform1f:@"elapsedTime" value:(GLfloat)self.elapsedTime];
        [obj.context setUniformMatrix4fv:@"projectionMatrix" value:self.worldProjectionMatrix];
        [obj.context setUniformMatrix4fv:@"cameraMatrix" value:self.cameraMatrix];
        [obj.context setUniform3fv:@"lightDirection" value:self.lightDirection];
        [obj draw:obj.context];
    }];
}
This article covered the basic approach to rendering ARKit content with OpenGL ES without going deep into OpenGL ES details. If you are interested, read the code in the sample project to learn more.