Behind the scenes of Camera.ScreenToWorldPoint() in Unity 2D

2016-11-19 09:00:12 Coding C++ Unity

For some reason I need to compute Camera.main.ScreenToWorldPoint manually, so I need to know how it works internally. Since in 2D we have $$P_s = (P_v.x * w, P_v.y * h)$$, where $$w$$ and $$h$$ are the screen width and height in pixels, it is enough to understand how ViewportToWorldPoint works.
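
In code, that first step is just a normalization by the screen dimensions. A minimal sketch (ScreenToViewport is my own helper name, not a Unity API):

private Vector2 ScreenToViewport(float x, float y)
{
    // Normalize pixel coordinates into the [0, 1] viewport range.
    return new Vector2(x / Screen.width, y / Screen.height);
}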

In theory (CG basics): $$P_v = M_{proj} * M_{view} * P_w$$, thus

$$P_w = M_{view}^{-1} * M_{proj}^{-1} * P_v $$
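
Unity makes both inverses easy to get: cameraToWorldMatrix is the inverse of the view matrix (worldToCameraMatrix), and the projection matrix can be inverted with .inverse. A sketch of this naive chain (which still misses the range fix discussed next):

// M_view^{-1}: cameraToWorldMatrix is the inverse of the view matrix.
var invView = Camera.main.cameraToWorldMatrix;
// M_proj^{-1}: invert the projection matrix directly.
var invProj = Camera.main.projectionMatrix.inverse;
var naiveViewportToWorld = invView * invProj;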

However, in Unity $$P_v$$ is in the viewport range [0, 1] rather than the NDC range [-1, 1], which means it is actually

$$P_v = M_{extra} * M_{proj} * M_{view} * P_w$$

and thus

$$P_w = M_{view}^{-1} * M_{proj}^{-1} * M_{extra}^{-1} * P_v $$

where

$$M_{extra} = \begin{bmatrix} 1 & 0 & 0 & 1 \\ 0 & 1 & 0 & 1 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 2 \end{bmatrix}, \qquad M_{extra}^{-1} = \begin{bmatrix} 1 & 0 & 0 & -0.5 \\ 0 & 1 & 0 & -0.5 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0.5 \end{bmatrix}$$
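
A quick sanity check: applying $$M_{extra}$$ to an NDC point and performing the homogeneous divide maps [-1, 1] onto [0, 1]:

$$M_{extra} * (x, y, z, 1)^T = (x + 1, y + 1, z, 2)^T \xrightarrow{/w} (\tfrac{x+1}{2}, \tfrac{y+1}{2}, \tfrac{z}{2})$$

so $$x = -1$$ lands at 0 and $$x = 1$$ lands at 1, as required.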

Note that we don’t care about Z, since we are working in 2D with an orthographic projection.

Now, in the language of Unity:

Initialize:

// magicMatrix is M_extra^{-1}; fields not set here default to 0.
var magicMatrix = new Matrix4x4
{
    m00 = 1,
    m03 = -0.5f,
    m11 = 1,
    m13 = -0.5f,
    m22 = 1,
    m33 = 0.5f
};
// Cache M_view^{-1} * M_proj^{-1} * M_extra^{-1};
// cameraToWorldMatrix is M_view^{-1}.
_originalViewportToWorldMatrix =
    Camera.main.cameraToWorldMatrix * Camera.main.projectionMatrix.inverse * magicMatrix;

Compute:

private Vector3 OriginalScreenToWorld(float x, float y, float setZ)
{
    // _width and _height cache Screen.width and Screen.height.
    // Screen -> viewport by normalizing, then viewport -> world
    // through the cached matrix.
    var vec = _originalViewportToWorldMatrix * new Vector4(x / _width, y / _height, 0, 1);
    // Homogeneous divide; Z is overridden since we are in 2D.
    return new Vector3(vec.x / vec.w, vec.y / vec.w, setZ);
}
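
To verify the math end to end, one can log the manual result next to Unity's built-in call. A test sketch under the assumptions above (ScreenToWorldCheck is a hypothetical component name; the matrix is rebuilt every frame because cameraToWorldMatrix changes whenever the camera moves):

using UnityEngine;

public class ScreenToWorldCheck : MonoBehaviour
{
    private Matrix4x4 _originalViewportToWorldMatrix;
    private float _width, _height;

    private void Update()
    {
        // Rebuild every frame: cameraToWorldMatrix changes whenever
        // the camera moves, so a one-time cache would go stale.
        var magicMatrix = new Matrix4x4
        {
            m00 = 1, m03 = -0.5f,
            m11 = 1, m13 = -0.5f,
            m22 = 1, m33 = 0.5f
        };
        _originalViewportToWorldMatrix =
            Camera.main.cameraToWorldMatrix * Camera.main.projectionMatrix.inverse * magicMatrix;
        _width = Screen.width;
        _height = Screen.height;

        var mouse = Input.mousePosition;
        var manual = OriginalScreenToWorld(mouse.x, mouse.y, 0f);
        var builtIn = Camera.main.ScreenToWorldPoint(mouse);
        // The XY components should match; Z is whatever we set it to.
        Debug.Log("manual: " + manual + "  built-in: " + builtIn);
    }

    private Vector3 OriginalScreenToWorld(float x, float y, float setZ)
    {
        var vec = _originalViewportToWorldMatrix * new Vector4(x / _width, y / _height, 0, 1);
        return new Vector3(vec.x / vec.w, vec.y / vec.w, setZ);
    }
}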