Android OpenGL ES 2.0: screen coordinates to world coordinates

2023-09-12 23:03:04 Author: AK47持西施


I'm building an Android application that uses OpenGL ES 2.0 and I've run into a wall. I'm trying to convert screen coordinates (where the user touches) to world coordinates. I've tried reading and playing around with GLU.gluUnProject but I'm either doing it wrong or just don't understand it.

This is my attempt:

public void getWorldFromScreen(float x, float y) {
    int viewport[] = { 0, 0, width , height};

    float startY = ((float) (height) - y);
    float[] near = { 0.0f, 0.0f, 0.0f, 0.0f };
    float[] far = { 0.0f, 0.0f, 0.0f, 0.0f };

    float[] mv = new float[16];
    Matrix.multiplyMM(mv, 0, mViewMatrix, 0, mModelMatrix, 0);

    GLU.gluUnProject(x, startY, 0, mv, 0, mProjectionMatrix, 0, viewport, 0, near, 0);
    GLU.gluUnProject(x, startY, 1, mv, 0, mProjectionMatrix, 0, viewport, 0, far, 0);

    float nearX = near[0] / near[3];
    float nearY = near[1] / near[3];
    float nearZ = near[2] / near[3];

    float farX = far[0] / far[3];
    float farY = far[1] / far[3];
    float farZ = far[2] / far[3];
}

The numbers I am getting don't seem right. Is this the right way to use this method? Does it work for OpenGL ES 2.0? Should I make the model matrix an identity matrix before these calculations (Matrix.setIdentityM(mModelMatrix, 0))?

As a follow-up, if this is correct, how do I pick the output Z? Basically, I always know the distance at which I want the world coordinates to lie, but the Z parameter in GLU.gluUnProject appears to be some kind of interpolation between the near and far planes. Is it just a linear interpolation?
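(For reference: with a standard perspective projection, winZ is not a linear interpolation in eye space; window-space depth is a rational function of eye-space depth, so most of the [0,1] range is spent close to the near plane. A minimal sketch of the mapping, assuming a standard OpenGL perspective matrix and the default [0,1] depth range; `winZForEyeDepth` is a hypothetical helper, not part of the Android API:)

```java
// Sketch: convert a desired eye-space distance into the winZ value to
// pass to gluUnProject, assuming a standard OpenGL perspective projection
// with the given near/far clip planes. Hypothetical helper, not an Android API.
public class UnprojectDepth {
    public static float winZForEyeDepth(float near, float far, float eyeDepth) {
        // NDC z for a point at eye-space z = -eyeDepth:
        //   ndc = (f+n)/(f-n) + 2fn / ((f-n) * z_eye), with z_eye = -eyeDepth
        float ndc = (far + near) / (far - near)
                  - (2f * far * near) / ((far - near) * eyeDepth);
        // Map NDC z in [-1,1] to window z in [0,1] (default depth range).
        return (ndc + 1f) / 2f;
    }

    public static void main(String[] args) {
        System.out.println(winZForEyeDepth(1f, 100f, 1f));   // near plane, ~0
        System.out.println(winZForEyeDepth(1f, 100f, 100f)); // far plane, ~1
        System.out.println(winZForEyeDepth(1f, 100f, 2f));   // ~0.505, not 0.01
    }
}
```

With near = 1 and far = 100, a point only 2 units from the eye already maps to winZ ≈ 0.505, which shows the interpolation is anything but linear.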

Thanks in advance

Solution

/**
 * Calculates the transform from screen coordinate
 * system to world coordinate system coordinates
 * for a specific point, given a camera position.
 *
 * @param touch Vec2 point of screen touch, the
 *   actual position on the physical screen (e.g. 160, 240)
 * @param cam camera object with x, y, z of the
 *   camera and screenWidth and screenHeight of
 *   the device.
 * @return position in WCS.
 */
public Vec2 GetWorldCoords(Vec2 touch, Camera cam)
{
    // Initialize auxiliary variables.
    Vec2 worldPos = new Vec2();

    // Screen height & width (e.g. 320 x 480)
    float screenW = cam.GetScreenWidth();
    float screenH = cam.GetScreenHeight();

    // Auxiliary matrices and vectors
    // to deal with OpenGL.
    float[] invertedMatrix, transformMatrix,
        normalizedInPoint, outPoint;
    invertedMatrix = new float[16];
    transformMatrix = new float[16];
    normalizedInPoint = new float[4];
    outPoint = new float[4];

    // Invert the y coordinate, since Android uses a
    // top-left origin and OpenGL bottom-left.
    int oglTouchY = (int) (screenH - touch.Y());

    /* Transform the screen point to clip
       space in OpenGL (-1, 1). */
    normalizedInPoint[0] =
        (float) (touch.X() * 2.0f / screenW - 1.0);
    normalizedInPoint[1] =
        (float) (oglTouchY * 2.0f / screenH - 1.0);
    normalizedInPoint[2] = -1.0f;
    normalizedInPoint[3] = 1.0f;

    /* Obtain the transform matrix and
       then its inverse. */
    Print("Proj", getCurrentProjection(gl));
    Print("Model", getCurrentModelView(gl));
    Matrix.multiplyMM(
        transformMatrix, 0,
        getCurrentProjection(gl), 0,
        getCurrentModelView(gl), 0);
    Matrix.invertM(invertedMatrix, 0,
        transformMatrix, 0);

    /* Apply the inverse to the point
       in clip space. */
    Matrix.multiplyMV(
        outPoint, 0,
        invertedMatrix, 0,
        normalizedInPoint, 0);

    if (outPoint[3] == 0.0)
    {
        // Avoid a divide-by-zero error.
        Log.e("World coords", "ERROR!");
        return worldPos;
    }

    // Divide by the w component (outPoint[3])
    // to find the real position.
    worldPos.Set(
        outPoint[0] / outPoint[3],
        outPoint[1] / outPoint[3]);

    return worldPos;
}

Algorithm is further explained here.
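To address the follow-up about picking Z: rather than trying to guess the winZ interpolation, a common approach is to unproject the touch point twice (at winZ = 0 and winZ = 1) to obtain a ray through the scene, then intersect that ray with the plane at the desired world-space z. A minimal sketch under that assumption; `intersectZPlane` is a hypothetical helper, not part of the answer above:

```java
// Sketch: given the unprojected near and far points of a touch ray
// (world space, e.g. from gluUnProject with winZ = 0 and winZ = 1),
// find where the ray crosses the plane z = planeZ. Hypothetical helper.
public class RayPlane {
    public static float[] intersectZPlane(float[] near, float[] far, float planeZ) {
        float dz = far[2] - near[2];
        if (dz == 0f) {
            // Ray is parallel to the plane; no single intersection.
            return null;
        }
        // Parametric position along the ray where z == planeZ.
        float t = (planeZ - near[2]) / dz;
        return new float[] {
            near[0] + t * (far[0] - near[0]),
            near[1] + t * (far[1] - near[1]),
            planeZ
        };
    }

    public static void main(String[] args) {
        float[] near = { 0f, 0f, 0f };
        float[] far  = { 10f, 10f, 10f };
        float[] hit = intersectZPlane(near, far, 5f);
        System.out.println(hit[0] + ", " + hit[1] + ", " + hit[2]); // 5.0, 5.0, 5.0
    }
}
```

This sidesteps the depth nonlinearity entirely, since the interpolation is done in world space where it really is linear.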
