OpenGL ES _ Getting Started _ 01
OpenGL ES _ Getting Started _ 02
OpenGL ES _ Getting Started _ 03
OpenGL ES _ Getting Started _ 04
OpenGL ES _ Getting Started _ 05
OpenGL ES _ Beginner Exercises _ 01
OpenGL ES _ Beginner Exercises _ 02
OpenGL ES _ Beginner Exercises _ 03
OpenGL ES _ Beginner Exercises _ 04
OpenGL ES _ Beginner Exercises _ 05
OpenGL ES _ Beginner Exercises _ 06
OpenGL ES _ Shaders _ Introduction
OpenGL ES _ Shaders _ Programs
OpenGL ES _ Shaders _ Syntax
OpenGL ES _ Shaders _ Texture Images
OpenGL ES _ Shaders _ Preprocessing
OpenGL ES _ Shaders _ Vertex Shaders in Depth
OpenGL ES _ Shaders _ Fragment Shaders in Depth
OpenGL ES _ Shaders _ Practice 01
OpenGL ES _ Shaders _ Practice 02
OpenGL ES _ Shaders _ Practice 03
Practice 02 covered the principle and implementation of multi-screen display in detail; today we continue our OpenGL journey. However good your skills get, you still have to keep learning!
Learning Goal
Build a panoramic video player, with the dual-screen display framework a VR headset needs!
What You Should Know
- The principle of panoramic display
Put plainly: the red region is like your phone screen. When you rotate the phone, we rotate the sphere in the opposite direction, and that is how you get to see the picture mapped onto the sphere.
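As a minimal sketch of that idea (using the same GLKit matrices that appear later in this post; `devicePitch` and `deviceYaw` are hypothetical rotation inputs in radians, fed by gestures or sensors):

```objc
// Counter-rotate the sphere's model matrix against the device rotation.
// devicePitch / deviceYaw are hypothetical inputs (gesture deltas or sensor angles).
GLKMatrix4 model = GLKMatrix4Identity;
model = GLKMatrix4RotateX(model, -devicePitch); // look up / down
model = GLKMatrix4RotateY(model, -deviceYaw);   // look left / right
GLKMatrix4 mvp = GLKMatrix4Multiply(_projectionMatrix, model);
```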
Preparation
Find a panoramic video and add it to the project.
- Implementation Steps
1. Create a sphere model (a sketch follows this list).
2. Grab every frame of the video, convert it to RGB, and render it onto the sphere.
3. As gestures move, change the sphere model's model-view matrix.
4. In VR mode, read the motion sensors to follow the user's movement and adjust the view matrix.
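The real generators live in OSSphere.h and are not reproduced in this post; as a rough sketch of what a latitude/longitude (UV) sphere generator for step 1 might look like (a hypothetical function of my own, not the project's code):

```c
#include <math.h>
#include <OpenGLES/ES2/gl.h>

// Hypothetical UV-sphere generator: fills interleaved XYZ positions and
// separate ST texture coordinates for (rings+1) * (sectors+1) vertices.
// The caller allocates: 3 floats per vertex for vertices, 2 per vertex for texCoords.
void GenerateSphere(int rings, int sectors, float radius,
                    GLfloat *vertices, GLfloat *texCoords)
{
    int v = 0, t = 0;
    for (int r = 0; r <= rings; r++) {
        float phi = M_PI * r / rings;                    // 0..π, pole to pole
        for (int s = 0; s <= sectors; s++) {
            float theta = 2.0f * M_PI * s / sectors;     // 0..2π around the sphere
            vertices[v++] = radius * sinf(phi) * cosf(theta); // x
            vertices[v++] = radius * cosf(phi);               // y
            vertices[v++] = radius * sinf(phi) * sinf(theta); // z
            texCoords[t++] = (GLfloat)s / sectors;       // u: full 360° of the frame width
            texCoords[t++] = (GLfloat)r / rings;         // v: top to bottom of the frame
        }
    }
}
```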
Features Implemented
- Regular video playback
- Panoramic video playback
- VR dual-screen display mode
- Fast-forward and rewind
- Play and pause
- Showing an ad while paused
Core Code Walkthrough
If you want to be able to type this code out from scratch like I did, make sure you have the basics of OpenGL ES 2.0 and a simple grounding in GLSL. If you don't, no problem: I've already written an OpenGL tutorial series and a GLSL tutorial, so head over there first. Now, on to the walkthrough.
- 視頻采集
<p>工程中的兩個(gè)文件 XJVRPlayerViewController.h和XJVRPlayerController.m主要負(fù)責(zé)視頻數(shù)據(jù)采集,界面布局在XJVRPlayerViewController中可以更改,主要使用AVFoundation框架這部分內(nèi)容今天咱不講解,后面我會(huì)寫(xiě)關(guān)于視頻采集的教程</p>
Model Creation
a. The panorama player generates the vertex and texture coordinates of a sphere
b. The regular player generates the vertex and texture coordinates of a rectangle
Both generator functions are in OSSphere.h.
- Loading the data into the GPU
```objc
// Load the vertex index data
glGenBuffers(1, &_indexBuffer);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, _indexBuffer);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, _numIndices*sizeof(GLushort), _indices, GL_STATIC_DRAW);

// Load the vertex positions
glGenBuffers(1, &_vertexBuffer);
glBindBuffer(GL_ARRAY_BUFFER, _vertexBuffer);
glBufferData(GL_ARRAY_BUFFER, numVertices*strideNum*sizeof(GLfloat), _vertices, GL_STATIC_DRAW);
glEnableVertexAttribArray(GLKVertexAttribPosition);
glVertexAttribPointer(GLKVertexAttribPosition, strideNum, GL_FLOAT, GL_FALSE, strideNum*sizeof(GLfloat), NULL);

// Load the texture coordinates
glGenBuffers(1, &_textureCoordBuffer);
glBindBuffer(GL_ARRAY_BUFFER, _textureCoordBuffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(GLfloat)*2*numVertices, _texCoords, GL_DYNAMIC_DRAW);
glEnableVertexAttribArray(GLKVertexAttribTexCoord0);
glVertexAttribPointer(GLKVertexAttribTexCoord0, 2, GL_FLOAT, GL_FALSE, 2*sizeof(GLfloat), NULL);
```
The functions above were all covered in earlier tutorials, so I won't repeat them here.
- Shader Programs
I split the shaders into two kinds: one renders panoramic video and the other renders regular video. There is little difference between them; the panorama shader just adds a view-transform matrix. (Panorama shader: ShadePanorama; regular shader: ShaderNormal.)
Below is the code of the panorama shaders:
a. Vertex shader
```glsl
attribute vec4 position;                 // vertex position attribute
attribute vec2 texCoord0;                // texture coordinate
varying vec2 texCoordVarying;            // fragment-shader input carrying the texture coordinate
uniform mat4 modelViewProjectionMatrix;  // view transform matrix

void main() {
    texCoordVarying = texCoord0;
    gl_Position = modelViewProjectionMatrix * position;
}
```
b. Fragment shader
```glsl
precision mediump float;       // default float precision
varying vec2 texCoordVarying;
uniform sampler2D sam2DY;      // Y texture sampler
uniform sampler2D sam2DUV;     // UV texture sampler

void main() {
    mediump vec3 yuv;
    lowp vec3 rgb;
    // YUV -> RGB conversion matrix (BT.709, video range)
    mediump mat3 convert = mat3(1.164,  1.164, 1.164,
                                0.0,   -0.213, 2.112,
                                1.793, -0.533, 0.0);
    yuv.x  = texture2D(sam2DY,  texCoordVarying).r - (16.0/255.0);
    yuv.yz = texture2D(sam2DUV, texCoordVarying).rg - vec2(0.5, 0.5);
    rgb = convert * yuv;
    gl_FragColor = vec4(rgb, 1.0);
}
```
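A quick sanity check on that matrix (remember that GLSL mat3 constructors are column-major, so the first three numbers above form the first column): a video-range white pixel has Y = 235/255 and U = V = 0.5, so yuv = (219/255, 0, 0), and every output channel becomes 1.164 × 219/255 ≈ 1.0, exactly the RGB white we expect.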
If you want to learn more about the shading language, click here.
- Creating the Shader Program
The point of creating the shader program is to compile the shader source we just wrote, and to connect the shader's variables to our application code.
```objc
/**
 * Create and compile the shader program
 *
 * @param vshName vertex shader file name
 * @param fshName fragment shader file name
 */
- (void)createShaderProgramVertexShaderName:(NSString *)vshName FragmentShaderName:(NSString *)fshName {
    self.shaderManager = [[OSShaderManager alloc] init];
    // Compile the two shader files
    GLuint vertexShader, fragmentShader;
    NSURL *vertexShaderPath = [[NSBundle mainBundle] URLForResource:vshName withExtension:@"vsh"];
    NSURL *fragmentShaderPath = [[NSBundle mainBundle] URLForResource:fshName withExtension:@"fsh"];
    if (![self.shaderManager compileShader:&vertexShader type:GL_VERTEX_SHADER URL:vertexShaderPath] ||
        ![self.shaderManager compileShader:&fragmentShader type:GL_FRAGMENT_SHADER URL:fragmentShaderPath]) {
        return;
    }
    // Note: bind the attribute locations BEFORE linking the program.
    // The location value is whatever you choose; just remember it, you'll need it later.
    [self.shaderManager bindAttribLocation:GLKVertexAttribPosition andAttribName:"position"];
    [self.shaderManager bindAttribLocation:GLKVertexAttribTexCoord0 andAttribName:"texCoord0"];
    // Link the two compiled objects into the shader program
    if (![self.shaderManager linkProgram]) {
        [self.shaderManager deleteShader:&vertexShader];
        [self.shaderManager deleteShader:&fragmentShader];
    }
    _textureBufferY = [self.shaderManager getUniformLocation:"sam2DY"];
    _textureBufferUV = [self.shaderManager getUniformLocation:"sam2DUV"];
    _modelViewProjectionMatrixIndex = [self.shaderManager getUniformLocation:"modelViewProjectionMatrix"];
    [self.shaderManager detachAndDeleteShader:&vertexShader];
    [self.shaderManager detachAndDeleteShader:&fragmentShader];
    // Use the shader program
    [self.shaderManager useProgram];
}
```
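As a usage sketch (assuming the .vsh/.fsh pair share the ShadePanorama base name mentioned above):

```objc
// Hypothetical call site: compile and link the panorama shader pair.
[self createShaderProgramVertexShaderName:@"ShadePanorama" FragmentShaderName:@"ShadePanorama"];
```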
The OSShaderManager class used above is my thin wrapper around the usual shader compile/link boilerplate; its methods are listed below.
```objc
/**
 * Compile a shader
 * @param shader shader handle (out)
 * @param type   shader type
 * @param URL    local path of the shader source
 * @return whether compilation succeeded
 */
- (BOOL)compileShader:(GLuint *)shader type:(GLenum)type URL:(NSURL *)URL;
/**
 * Link the program
 * @return whether linking succeeded
 */
- (BOOL)linkProgram;
/**
 * Validate the program
 * @return whether validation succeeded
 */
- (BOOL)validateProgram;
/**
 * Bind a shader attribute location
 * @param index index of the attribute in the shader program
 * @param name  attribute name
 */
- (void)bindAttribLocation:(GLuint)index andAttribName:(GLchar *)name;
/**
 * Delete a shader
 */
- (void)deleteShader:(GLuint *)shader;
/**
 * Get a uniform's location
 * @param name uniform name
 * @return the location
 */
- (GLint)getUniformLocation:(const GLchar *)name;
/**
 * Detach and delete a shader
 * @param shader shader handle
 */
- (void)detachAndDeleteShader:(GLuint *)shader;
/**
 * Use the program
 */
- (void)useProgram;
```
For the concrete implementations, please read the project source.
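For reference, here is a minimal sketch of what compileShader:type:URL: typically boils down to (my assumption of the standard GL boilerplate, not the project's exact code; `_program` is assumed to come from glCreateProgram()):

```objc
// Sketch of a typical compile step for one shader stage.
- (BOOL)compileShader:(GLuint *)shader type:(GLenum)type URL:(NSURL *)URL {
    NSError *error;
    NSString *source = [NSString stringWithContentsOfURL:URL
                                                encoding:NSUTF8StringEncoding
                                                   error:&error];
    if (!source) {
        NSLog(@"Failed to load shader source: %@", error);
        return NO;
    }
    const GLchar *cSource = (const GLchar *)[source UTF8String];
    *shader = glCreateShader(type);
    glShaderSource(*shader, 1, &cSource, NULL); // upload the source
    glCompileShader(*shader);                   // compile it
    GLint status;
    glGetShaderiv(*shader, GL_COMPILE_STATUS, &status);
    if (status == 0) {                          // compile failed: clean up
        glDeleteShader(*shader);
        return NO;
    }
    glAttachShader(_program, *shader);          // attach to the program for linking
    return YES;
}
```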
- Pointing the texture samplers
```objc
glUniform1i(_textureBufferY, 0);  // 0 means GL_TEXTURE0
glUniform1i(_textureBufferUV, 1); // 1 means GL_TEXTURE1
```
A reminder here: these two calls only work after the shader program has been linked successfully and made current; call them before that and they simply have no effect.
- How do we split the YUV data and load it into the two samplers? Here we again need a framework we've used before: CoreVideo. What is it for? It is dedicated to handling our pixel data. The frames we capture from the video come as CVPixelBufferRef.
First, let's look at the format of our pixel data (the CVPixelBuffer's console description; the long attribute dictionaries are trimmed to the interesting entries):
```
<CVPixelBuffer 0x7fa27962c9c0 width=2048 height=1024 pixelFormat=420v iosurface=0x0 planes=2>
    <Plane 0 width=2048 height=1024 bytesPerRow=2048>
    <Plane 1 width=1024 height=512 bytesPerRow=2048>
    attributes: PixelFormatType = 420v, Width = 2048, Height = 1024
    propagated attachments: CVImageBufferYCbCrMatrix = ITU_R_601_4,
        CVImageBufferTransferFunction = ITU_R_709_2,
        CVImageBufferColorPrimaries = SMPTE_C,
        CVImageBufferChromaSubsampling = 4:2:0
```
From that log output, the lines we care about are:
```
<CVPixelBuffer 0x7fa27962c9c0 width=2048 height=1024 pixelFormat=420v iosurface=0x0 planes=2>
<Plane 0 width=2048 height=1024 bytesPerRow=2048>
<Plane 1 width=1024 height=512 bytesPerRow=2048>
```
What we learn from this:
Pixel format: 420v
Data planes: 2
Plane 0: width=2048 height=1024
Plane 1: width=1024 height=512
So the data is arranged as YY....YY....UV....UV (bi-planar): 2048×1024 Y samples followed by 1024×512 interleaved UV pairs, and from bytesPerRow we can tell that each Y, U and V sample occupies exactly one byte.
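If you want to verify those numbers in code, CVPixelBuffer exposes the plane geometry directly (a small sketch; `pixelBuffer` is the frame from above):

```objc
// Inspect the planes of an NV12 (420v) pixel buffer.
CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
size_t planes = CVPixelBufferGetPlaneCount(pixelBuffer);           // 2 for 420v
for (size_t i = 0; i < planes; i++) {
    size_t w   = CVPixelBufferGetWidthOfPlane(pixelBuffer, i);     // 2048, then 1024
    size_t h   = CVPixelBufferGetHeightOfPlane(pixelBuffer, i);    // 1024, then 512
    size_t bpr = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, i);
    NSLog(@"plane %zu: %zux%zu, bytesPerRow=%zu", i, w, h, bpr);
}
CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
```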
Next up: how to load this data into our texture buffers.
```c
CVReturn CVOpenGLESTextureCacheCreateTextureFromImage(
    CFAllocatorRef CV_NULLABLE allocator,
    CVOpenGLESTextureCacheRef CV_NONNULL textureCache,
    CVImageBufferRef CV_NONNULL sourceImage,
    CFDictionaryRef CV_NULLABLE textureAttributes,
    GLenum target,
    GLint internalFormat,
    GLsizei width,
    GLsizei height,
    GLenum format,
    GLenum type,
    size_t planeIndex,
    CV_RETURNS_RETAINED_PARAMETER CVOpenGLESTextureRef CV_NULLABLE * CV_NONNULL textureOut);
```
This function creates a texture object from a CVImageBufferRef. Its parameters:
allocator: just pass the default, kCFAllocatorDefault
textureCache: a texture cache object, which we have to create ourselves
sourceImage: our CVImageBufferRef data
textureAttributes: texture attributes; may be NULL
target: the texture type (GL_TEXTURE_2D or GL_RENDERBUFFER)
internalFormat: the internal data format, i.e. how many components each texel carries
width: the width of the texture
height: the height of the texture
format: the format of the pixel data
type: the data type
planeIndex: the index of the plane to use
Now let's look at our code:
```objc
// Activate texture unit 0 (Y plane)
glActiveTexture(GL_TEXTURE0);
err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                   _videoTextureCache,
                                                   pixelBuffer,
                                                   NULL,
                                                   GL_TEXTURE_2D,
                                                   GL_RED_EXT,
                                                   width,
                                                   height,
                                                   GL_RED_EXT,
                                                   GL_UNSIGNED_BYTE,
                                                   0,
                                                   &_lumaTexture);
if (err) {
    NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err);
}
glBindTexture(CVOpenGLESTextureGetTarget(_lumaTexture), CVOpenGLESTextureGetName(_lumaTexture));
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

// UV-plane.
// Activate texture unit 1
glActiveTexture(GL_TEXTURE1);
err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                   _videoTextureCache,
                                                   pixelBuffer,
                                                   NULL,
                                                   GL_TEXTURE_2D,
                                                   GL_RG_EXT,
                                                   width / 2,
                                                   height / 2,
                                                   GL_RG_EXT,
                                                   GL_UNSIGNED_BYTE,
                                                   1,
                                                   &_chromaTexture);
if (err) {
    NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err);
}
glBindTexture(CVOpenGLESTextureGetTarget(_chromaTexture), CVOpenGLESTextureGetName(_chromaTexture));
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
```
GL_RED_EXT means one component per texel, and GL_RG_EXT means two. UV comes in pairs of two components, so we pick GL_RG_EXT.
As mentioned, the parameters need a texture cache, so next we create one ourselves:
```objc
CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, self.eagContext, NULL, &_videoTextureCache);
```
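One caveat the snippets above don't show (my assumption about typical CVOpenGLESTextureCache usage, not code taken from this project): the CVOpenGLESTextureRef objects should be released and the cache flushed each frame before creating the next textures, otherwise the cache keeps growing:

```objc
// Typical per-frame cleanup before creating the next pair of textures.
if (_lumaTexture) {
    CFRelease(_lumaTexture);
    _lumaTexture = NULL;
}
if (_chromaTexture) {
    CFRelease(_chromaTexture);
    _chromaTexture = NULL;
}
// Flush texture references that are no longer in use.
CVOpenGLESTextureCacheFlush(_videoTextureCache, 0);
```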
With all the groundwork above done, the only thing left is displaying it.
- Render and Draw
```objc
// Clear the color buffer
glClearColor(0, 0, 0, 1);
glClear(GL_COLOR_BUFFER_BIT);
if (_isVR) {
    // Render two screens, one viewport per eye
    // (the *2 factors assume a Retina scale of 2, since glViewport takes pixels, not points)
    glViewport(0, 0, self.view.bounds.size.width, self.view.bounds.size.height*2);
    glDrawElements(GL_TRIANGLES, _numIndices, GL_UNSIGNED_SHORT, 0);
    glViewport(self.view.bounds.size.width, 0, self.view.bounds.size.width, self.view.bounds.size.height*2);
    glDrawElements(GL_TRIANGLES, _numIndices, GL_UNSIGNED_SHORT, 0);
} else {
    // Render a single screen
    glViewport(0, 0, self.view.bounds.size.width*2, self.view.bounds.size.height*2);
    glDrawElements(GL_TRIANGLES, _numIndices, GL_UNSIGNED_SHORT, 0);
}
```
At this point, the video can already be displayed.
- View Matrix Initialization
```objc
- (void)initModelViewProjectMatrix {
    // Build the projection matrix
    float aspect = fabs(self.view.bounds.size.width / self.view.bounds.size.height);
    _projectionMatrix = GLKMatrix4MakePerspective(GLKMathDegreesToRadians(OSVIEW_CORNER), aspect, 0.1f, 400.0f);
    _projectionMatrix = GLKMatrix4Rotate(_projectionMatrix, ES_PI, 1.0f, 0.0f, 0.0f);
    // Build the model matrix
    _modelViewMatrix = GLKMatrix4Identity;
    float scale = OSSphereScale;
    _modelViewMatrix = GLKMatrix4Scale(_modelViewMatrix, scale, scale, scale);
    // The final matrix handed to the GLSL program
    _modelViewProjectionMatrix = GLKMatrix4Multiply(_projectionMatrix, _modelViewMatrix);
    glUniformMatrix4fv(_modelViewProjectionMatrixIndex, 1, GL_FALSE, _modelViewProjectionMatrix.m);
}
```
- Panorama: Single-Screen Mode
Driving the matrix with gestures:
```objc
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if (self.isVR || self.vedioType == OSNormal) return;
    UITouch *touch = [touches anyObject];
    float distX = [touch locationInView:touch.view].x - [touch previousLocationInView:touch.view].x;
    float distY = [touch locationInView:touch.view].y - [touch previousLocationInView:touch.view].y;
    distX *= -0.005;
    distY *= -0.005;
    self.fingerRotationX += distY * OSVIEW_CORNER / 100;
    self.fingerRotationY -= distX * OSVIEW_CORNER / 100;
    _modelViewMatrix = GLKMatrix4Identity;
    float scale = OSSphereScale;
    _modelViewMatrix = GLKMatrix4Scale(_modelViewMatrix, scale, scale, scale);
    _modelViewMatrix = GLKMatrix4RotateX(_modelViewMatrix, self.fingerRotationX);
    _modelViewMatrix = GLKMatrix4RotateY(_modelViewMatrix, self.fingerRotationY);
    _modelViewProjectionMatrix = GLKMatrix4Multiply(_projectionMatrix, _modelViewMatrix);
    glUniformMatrix4fv(_modelViewProjectionMatrixIndex, 1, GL_FALSE, _modelViewProjectionMatrix.m);
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if (self.isVR || self.vedioType == OSNormal) return;
    for (UITouch *touch in touches) {
        [self.currentTouches removeObject:touch];
    }
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        [self.currentTouches removeObject:touch];
    }
}
```
- Panorama: VR Mode
Using the motion sensors:
```objc
- (void)startMotionManager {
    self.motionManager = [[CMMotionManager alloc] init];
    self.motionManager.deviceMotionUpdateInterval = 1.0 / 60.0;
    self.motionManager.gyroUpdateInterval = 1.0f / 60;
    self.motionManager.showsDeviceMovementDisplay = YES;
    [self.motionManager startDeviceMotionUpdatesUsingReferenceFrame:CMAttitudeReferenceFrameXArbitraryCorrectedZVertical];
    self.referenceAttitude = nil;
    [self.motionManager startGyroUpdatesToQueue:[[NSOperationQueue alloc] init]
                                    withHandler:^(CMGyroData * _Nullable gyroData, NSError * _Nullable error) {
        if (self.isVR) {
            [self calculateModelViewProjectMatrixWithDeviceMotion:self.motionManager.deviceMotion];
        }
    }];
    self.referenceAttitude = self.motionManager.deviceMotion.attitude;
}

- (void)calculateModelViewProjectMatrixWithDeviceMotion:(CMDeviceMotion *)deviceMotion {
    _modelViewMatrix = GLKMatrix4Identity;
    float scale = OSSphereScale;
    _modelViewMatrix = GLKMatrix4Scale(_modelViewMatrix, scale, scale, scale);
    if (deviceMotion != nil) {
        CMAttitude *attitude = deviceMotion.attitude;
        if (self.referenceAttitude != nil) {
            [attitude multiplyByInverseOfAttitude:self.referenceAttitude];
        } else {
            self.referenceAttitude = deviceMotion.attitude;
        }
        float cRoll = attitude.roll;
        float cPitch = attitude.pitch;
        _modelViewMatrix = GLKMatrix4RotateX(_modelViewMatrix, -cRoll);
        _modelViewMatrix = GLKMatrix4RotateY(_modelViewMatrix, -cPitch*3);
        _modelViewProjectionMatrix = GLKMatrix4Multiply(_projectionMatrix, _modelViewMatrix);
        // This call must run on the main thread.
        dispatch_async(dispatch_get_main_queue(), ^{
            glUniformMatrix4fv(_modelViewProjectionMatrixIndex, 1, GL_FALSE, _modelViewProjectionMatrix.m);
        });
    }
}
```
I don't want to dig into the matrix manipulation here; I'll dedicate a later post to matrix transforms and motion sensors, because both are hugely important in games, VR and AR alike. That's it for today; here are a few screenshots to enjoy.
Panorama Player: Implementation Option 2
You can also build a panorama player with SceneKit; if you want to learn more, take a look here.
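For reference, a minimal sketch of that SceneKit route (my own assumption about the usual approach, not the linked project's code; it assumes an existing AVPlayer named `player` and an SCNScene named `scene`): render the video into a SpriteKit scene and use it as the material of a sphere viewed from the inside.

```objc
#import <SceneKit/SceneKit.h>
#import <SpriteKit/SpriteKit.h>
#import <AVFoundation/AVFoundation.h>

// Sphere that the panorama is mapped onto.
SCNSphere *sphere = [SCNSphere sphereWithRadius:10.0];
// Render the video into a SpriteKit scene, which SceneKit accepts as material contents.
SKScene *videoScene = [SKScene sceneWithSize:CGSizeMake(2048, 1024)];
SKVideoNode *videoNode = [SKVideoNode videoNodeWithAVPlayer:player];
videoNode.position = CGPointMake(videoScene.size.width / 2, videoScene.size.height / 2);
videoNode.size = videoScene.size;
[videoScene addChild:videoNode];
sphere.firstMaterial.diffuse.contents = videoScene;
// Draw the inside of the sphere instead of the outside.
sphere.firstMaterial.cullMode = SCNCullModeFront;
// Note: the texture may appear mirrored; adjust diffuse.contentsTransform if needed.

SCNNode *sphereNode = [SCNNode nodeWithGeometry:sphere];
// The camera sits at the sphere's center; SCNView's camera controls handle rotation.
[scene.rootNode addChildNode:sphereNode];
```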