
Different SVD routines provide different results

subodhkalia opened this issue on Dec 08, 2023

Describe the bug

The SVD routines in Taichi and in another library return different factors for the same matrix, which leads to different results after modifying the singular values.

To Reproduce

The complete code to reproduce the issue:

import taichi as ti
import numpy as np



##########################################################
@ti.func
def taichi_print_matrix(mat):

    print("[[", mat[0, 0], ",", mat[0, 1], ",", mat[0, 2], "]")
    print(" [", mat[1, 0], ",", mat[1, 1], ",", mat[1, 2], "]")
    print(" [", mat[2, 0], ",", mat[2, 1], ",", mat[2, 2], "]]")
    print("")
##########################################################

##########################################################
@ti.kernel
def perform_svd_in_taichi():

    A = ti.Matrix([[ 0.999688,   0.0249922, 0.0],
                   [ -0.0249922, 0.999688,  0.0],
                   [ 0.0,        0.0,       1.0]])



    print("A is :")
    taichi_print_matrix(A)

    U, S, V = ti.svd(A)

    print("U is :")
    taichi_print_matrix(U)

    print("S is :")
    taichi_print_matrix(S)

    print("V is :")
    taichi_print_matrix(V)

    print("Reconstructed A")
    recon_A = U @ S @ V.transpose()
    taichi_print_matrix(recon_A)

    print("U Ut is :")
    taichi_print_matrix(U @ U.transpose())

    print("V Vt is :")
    taichi_print_matrix(V @ V.transpose())

    print("Projecting S. Now the projection T is :")
    T = ti.Matrix([[ -0.0770307, 0, 0 ],
                   [ 0, -0.0932477, 0 ],
                   [ 0, 0, -0.0932477 ]])
    taichi_print_matrix(T)

    print("new_A is :")
    new_A = U @ T @ V.transpose()
    taichi_print_matrix(new_A)

##########################################################

##########################################################
def print_numpy_matrix(mat):

    print("[[", mat[0, 0], ",", mat[0, 1], ",", mat[0, 2], "]")
    print(" [", mat[1, 0], ",", mat[1, 1], ",", mat[1, 2], "]")
    print(" [", mat[2, 0], ",", mat[2, 1], ",", mat[2, 2], "]]")
    print("")
##########################################################



if __name__ == '__main__':
    
    ti.init(arch=ti.cpu)

    perform_svd_in_taichi()

    print("-------------------------------------------", flush=True)
    print("Now testing a different SVD method....")
    print("-------------------------------------------", flush=True)

    A = np.array([[ 0.999688,   0.0249922, 0.0],
                   [ -0.0249922, 0.999688,  0.0],
                   [ 0.0,        0.0,       1.0]])

    print("A is :")
    print_numpy_matrix(A)

    U = np.array([[ 0.12432, 0.992242, 0 ],
                  [ -0.992242, 0.12432, 0 ],
                  [ 0, 0, 1 ]])

    S = np.eye(3)

    
    V = np.array([[ 0.14908, 0.988825, 0 ],
                  [ -0.988825, 0.14908, 0 ],
                  [ 0, -0, 1 ]])

    print("U is :")
    print_numpy_matrix(U)

    print("S is :")
    print_numpy_matrix(S)

    print("V is :")
    print_numpy_matrix(V)

    print("Reconstructed A")
    recon_A = U @ S @ V.transpose()
    print_numpy_matrix(recon_A)

    print("U Ut is :")
    print_numpy_matrix(U @ U.transpose())

    print("V Vt is :")
    print_numpy_matrix(V @ V.transpose())

    print("Projecting S. Now the projection T is :")
    T = np.array([[ -0.0770307, 0, 0 ],
                   [ 0, -0.0932477, 0 ],
                   [ 0, 0, -0.0932477 ]])
    print_numpy_matrix(T)

    print("new_A is :")
    new_A = U @ T @ V.transpose()
    print_numpy_matrix(new_A)

Log/Screenshots

$ python SVD.py
[Taichi] version 1.7.0, llvm 15.0.1, commit 2fd24490, win, python 3.9.13
[Taichi] Starting on arch=x64

A is :
[[ 0.999688 , 0.024992 , 0.000000 ]
 [ -0.024992 , 0.999688 , 0.000000 ]
 [ 0.000000 , 0.000000 , 1.000000 ]]

U is :
[[ 0.999688 , 0.024992 , 0.000000 ]
 [ -0.024992 , 0.999688 , 0.000000 ]
 [ 0.000000 , 0.000000 , 1.000000 ]]

S is :
[[ 1.000001 , 0.000000 , 0.000000 ]
 [ 0.000000 , 1.000001 , 0.000000 ]
 [ 0.000000 , 0.000000 , 1.000000 ]]

V is :
[[ 1.000000 , 0.000000 , 0.000000 ]
 [ 0.000000 , 1.000000 , 0.000000 ]
 [ 0.000000 , 0.000000 , 1.000000 ]]

Reconstructed A
[[ 0.999688 , 0.024992 , 0.000000 ]
 [ -0.024992 , 0.999688 , 0.000000 ]
 [ 0.000000 , 0.000000 , 1.000000 ]]

U Ut is :
[[ 1.000000 , 0.000000 , 0.000000 ]
 [ 0.000000 , 1.000000 , 0.000000 ]
 [ 0.000000 , 0.000000 , 1.000000 ]]

V Vt is :
[[ 1.000000 , 0.000000 , 0.000000 ]
 [ 0.000000 , 1.000000 , 0.000000 ]
 [ 0.000000 , 0.000000 , 1.000000 ]]

Projecting S. Now the projection T is :
[[ -0.077031 , 0.000000 , 0.000000 ]
 [ 0.000000 , -0.093248 , 0.000000 ]
 [ 0.000000 , 0.000000 , -0.093248 ]]

new_A is :
[[ -0.077007 , -0.002330 , -0.000000 ]
 [ 0.001925 , -0.093219 , 0.000000 ]
 [ -0.000000 , -0.000000 , -0.093248 ]]

-------------------------------------------
Now testing a different SVD method....
-------------------------------------------
A is :
[[ 0.999688 , 0.0249922 , 0.0 ]
 [ -0.0249922 , 0.999688 , 0.0 ]
 [ 0.0 , 0.0 , 1.0 ]]

U is :
[[ 0.12432 , 0.992242 , 0.0 ]
 [ -0.992242 , 0.12432 , 0.0 ]
 [ 0.0 , 0.0 , 1.0 ]]

S is :
[[ 1.0 , 0.0 , 0.0 ]
 [ 0.0 , 1.0 , 0.0 ]
 [ 0.0 , 0.0 , 1.0 ]]

V is :
[[ 0.14908 , 0.988825 , 0.0 ]
 [ -0.988825 , 0.14908 , 0.0 ]
 [ 0.0 , 0.0 , 1.0 ]]

Reconstructed A
[[ 0.9996873212499999 , 0.02499271335999999 , 0.0 ]
 [ -0.02499271336 , 0.9996873212499999 , 0.0 ]
 [ 0.0 , 0.0 , 1.0 ]]

U Ut is :
[[ 0.999999648964 , 6.22650969894778e-18 , 0.0 ]
 [ 6.22650969894778e-18 , 0.9999996489639998 , 0.0 ]
 [ 0.0 , 0.0 , 1.0 ]]

V Vt is :
[[ 0.999999727025 , 1.0380168724566375e-17 , 0.0 ]
 [ 1.0380168724566375e-17 , 0.999999727025 , 0.0 ]
 [ 0.0 , 0.0 , 1.0 ]]

Projecting S. Now the projection T is :
[[ -0.0770307 , 0.0 , 0.0 ]
 [ 0.0 , -0.0932477 , 0.0 ]
 [ 0.0 , 0.0 , -0.0932477 ]]

new_A is :
[[ -0.09291798361936843 , -0.004324080588687271 , 0.0 ]
 [ -6.836134608785159e-05 , -0.07730717394336756 , 0.0 ]
 [ 0.0 , 0.0 , -0.0932477 ]]

Additional comments

Consider the following matrix A:

[[ 0.999688 , 0.024992 , 0.000000 ]
 [ -0.024992 , 0.999688 , 0.000000 ]
 [ 0.000000 , 0.000000 , 1.000000 ]]

Performing an SVD with the Taichi kernel above yields the U, S, and V matrices shown in the log. The decomposition is valid: U.Ut = V.Vt = identity, and A can be reconstructed as U.S.Vt. Next, I apply a projection to S to obtain T (given above) and compute a new A as U.T.Vt. So far, so good.
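As a sanity check (my own NumPy sketch, not part of the original report), the validity conditions used above can be bundled into a small hypothetical helper and applied to the values Taichi printed in the log, where U equals A itself, S is nearly the identity, and V is exactly the identity:

```python
import numpy as np

def is_valid_svd(A, U, S, V, atol=1e-4):
    """Check the three conditions the report uses:
    U.Ut = I, V.Vt = I, and A = U.S.Vt (up to tolerance)."""
    return (np.allclose(U @ U.T, np.eye(3), atol=atol)
            and np.allclose(V @ V.T, np.eye(3), atol=atol)
            and np.allclose(U @ S @ V.T, A, atol=atol))

A = np.array([[ 0.999688,   0.0249922, 0.0],
              [-0.0249922,  0.999688,  0.0],
              [ 0.0,        0.0,       1.0]])

# Values Taichi printed in the log above.
U = A.copy()
S = np.diag([1.000001, 1.000001, 1.0])
V = np.eye(3)

print(is_valid_svd(A, U, S, V))  # True
```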

Now I use a different software package to perform the SVD and obtain the U, S, and V shown above. These U and V are not the same as Taichi's, although S matches in both cases. This is also a valid SVD, since U.Ut = V.Vt = identity and A can again be reconstructed as U.S.Vt. I apply the same projection to S; since S is the same, T is the same. But when I compute the new A as U.T.Vt, the result differs from Taichi's.

Why do the results differ depending on which SVD implementation is used? There are many implementations of SVD, and I am unsure which result is correct.

The difference arises because of T: if T were equal to S, both SVDs would give the same result upon reconstruction via U.S.Vt.
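This observation can be demonstrated with NumPy (my own illustration, not from the report): all three singular values of A are numerically 1, so S is almost a multiple of the identity and the singular vectors are not unique. Rotating U and V by the same orthogonal Q (the angle 0.7 below is arbitrary) produces another valid SVD; both decompositions reconstruct A, and they agree when S is reused, but they disagree as soon as the anisotropic T replaces S:

```python
import numpy as np

A = np.array([[ 0.999688,   0.0249922, 0.0],
              [-0.0249922,  0.999688,  0.0],
              [ 0.0,        0.0,       1.0]])

# Decomposition 1: NumPy's SVD (a third, equally valid answer).
U1, s, Vt1 = np.linalg.svd(A)
S = np.diag(s)  # all three singular values are numerically 1

# Decomposition 2: rotate both factors by the same orthogonal Q.
# Because the degenerate block of S is a scalar multiple of the
# identity, (U1 @ Q) S (V1 @ Q)^T reconstructs A just as well.
th = 0.7  # arbitrary angle
Q = np.array([[np.cos(th), -np.sin(th), 0.0],
              [np.sin(th),  np.cos(th), 0.0],
              [0.0,         0.0,        1.0]])
U2, Vt2 = U1 @ Q, Q.T @ Vt1

print(np.allclose(U1 @ S @ Vt1, A))              # True
print(np.allclose(U2 @ S @ Vt2, A))              # True
print(np.allclose(U1 @ S @ Vt1, U2 @ S @ Vt2))   # True

# Replace S by the anisotropic projection T from the report:
T = np.diag([-0.0770307, -0.0932477, -0.0932477])
print(np.allclose(U1 @ T @ Vt1, U2 @ T @ Vt2))   # False
```

The last line shows the same effect reported above: the two reconstructions only agree while the middle matrix commutes with the rotation relating the two sets of singular vectors.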

subodhkalia avatar Dec 08 '23 02:12 subodhkalia