Absolute coordinates from an iPhone rotation quaternion?



I have an iPhone with a gyroscope.


Let's say I have the phone's rotation as a quaternion Q.

I want to show points on screen relative to absolute world coordinates, so that as the phone rotates these points appear to stay "still" in real 3D space (a sort of augmented reality). Let's say they are 4 points forming a rectangle.

So I have created 4 points in 3D space relative to my phone screen and applied the transformation Q to each of them.

I thought this should be very simple, but my points get transformed not relative to world coordinates but to some coordinate frame I don't understand, possibly related to the phone's axes. Could you please help me with this? I need to create a new view on the screen that is the projection of the virtual points in absolute 3D space onto the rotated camera.

My rotation results seem right as long as I am not rotating the phone around its "normal" axis (perpendicular to the screen). But rotating in that direction results in completely wrong point translations.

Pseudocode included.

motionManager.StartDeviceMotionUpdates

        Quaternion Q;            // quaternion read from CMAttitude above, reference frame: XArbitraryZVertical
        var Qi = Q.Conjugate;    // inverse rotation (world -> device)

        // Four virtual points in world space forming a rectangle
        var vX = new Vector3d (-1, 0, 2);
        var vY = new Vector3d (1, 0, 2);
        var vZ = new Vector3d (1, 0, -2);
        var vW = new Vector3d (-1, 0, -2);

        // Rotate each point into the device frame
        var vXn = Vector3d.Transform(vX, Qi);
        var vYn = Vector3d.Transform(vY, Qi);
        var vZn = Vector3d.Transform(vZ, Qi);
        var vWn = Vector3d.Transform(vW, Qi);

        // Scale from world units to pixels
        var convertPixels = 50;

        vXn = vXn * convertPixels;
        vYn = vYn * convertPixels;
        vZn = vZn * convertPixels;
        vWn = vWn * convertPixels;

        // Screen projection: drop the Y component and center in the video area
        X.Frame = new RectangleF (new PointF ((float)(videoArea.Width / 2 + vXn.X), (float)(videoArea.Height / 2 + vXn.Z)), new SizeF (10, 10));
        Y.Frame = new RectangleF (new PointF ((float)(videoArea.Width / 2 + vYn.X), (float)(videoArea.Height / 2 + vYn.Z)), new SizeF (10, 10));
        Z.Frame = new RectangleF (new PointF ((float)(videoArea.Width / 2 + vZn.X), (float)(videoArea.Height / 2 + vZn.Z)), new SizeF (10, 10));
        W.Frame = new RectangleF (new PointF ((float)(videoArea.Width / 2 + vWn.X), (float)(videoArea.Height / 2 + vWn.Z)), new SizeF (10, 10));

Solution

Finally, I did it with OpenGL matrices. I multiply the rotation matrix given by the phone with a generated projection matrix. The key step was converting the rotation from the iPhone's coordinate convention to OpenGL's (right-handed to left-handed): I took the rotation axis, negated its x and y components, and built a new rotation matrix from this new axis and the same angle. That matrix gives me the right results.
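
For illustration, here is a minimal sketch of that axis flip in C#, assuming OpenTK's Quaternion and Matrix4 types; attitudeQ (the quaternion read from CMAttitude) and aspectRatio are hypothetical inputs, and the projection parameters are placeholders.

        using OpenTK;

        static Matrix4 BuildModelViewProjection (Quaternion attitudeQ, float aspectRatio)
        {
            // Decompose the device rotation into an axis and an angle.
            attitudeQ.ToAxisAngle (out Vector3 axis, out float angle);

            // Negate the x and y components of the axis to move the rotation
            // from the iPhone's convention to the one used for the OpenGL scene.
            var glAxis = new Vector3 (-axis.X, -axis.Y, axis.Z);

            // Rebuild the rotation from the flipped axis and the unchanged angle.
            var rotation = Matrix4.CreateFromQuaternion (Quaternion.FromAxisAngle (glAxis, angle));

            // Combine with a generated projection matrix (placeholder FOV and clip planes).
            var projection = Matrix4.CreatePerspectiveFieldOfView (MathHelper.PiOver4, aspectRatio, 0.1f, 100f);

            return rotation * projection;
        }

The rotation is multiplied on the left of the projection here to match OpenTK's row-vector convention; with a column-vector convention the order would be reversed.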