Sometimes the eigenvectors computed with python and Mathematica have opposite signs. Does anyone know why?

Nat*_*ate 5 python wolfram-mathematica eigenvector

I am trying to calculate the eigenvectors of many 3x3 matrices using python. My python code is based on Mathematica code that uses the Eigenvectors[] function. I have tried the eig() function from both numpy and scipy, and the majority of the time the eigenvectors calculated in Mathematica and python are identical. However, there are a few instances in which the eigenvectors calculated in python are opposite in sign to those calculated in Mathematica.

I have already tried using both numpy and scipy to calculate the eigenvectors, and both exhibit the same problem.

Python Code

    _, evec = eig(COVMATRIX)

Mathematica Code

    evec = Eigenvectors[COVMATRIX]

Note that in Mathematica the eigenvectors are along the rows whereas in python the eigenvectors are along the columns in the matrix.
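To compare the two conventions side by side, one can transpose numpy's result so that each row is an eigenvector, as in Mathematica. A minimal sketch (the variable names and the use of np.linalg.eig are assumptions; eigenvalue ordering may still differ between the two tools):

```python
import numpy as np

# First covariance matrix from the question
COVMATRIX = np.array([[2.9296875e-07, 0.0, 2.09676562e-10],
                      [0.0, 2.9296875e-07, 1.5842226562e-09],
                      [2.09676562e-10, 1.58422265e-09, 5.85710e-11]])

_, evec = np.linalg.eig(COVMATRIX)

# numpy returns eigenvectors as columns; transposing makes each row an
# eigenvector, matching Mathematica's Eigenvectors[] row convention
rows = evec.T
print(rows)
```

Note that even after transposing, the rows may appear in a different order than Mathematica's output, since Mathematica sorts eigenvectors by decreasing eigenvalue magnitude while numpy makes no ordering guarantee.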

Given

    COVMATRIX = [[2.9296875e-07, 0.0, 2.09676562e-10], 
                 [0.0, 2.9296875e-07, 1.5842226562e-09], 
                 [2.09676562e-10, 1.58422265e-09, 5.85710e-11]]

The eigenvectors from python are

    [[-7.15807155e-04,  9.91354763e-01, -1.31206788e-01],
     [-5.40831983e-03, -1.31208740e-01, -9.91340011e-01],
     [9.99985119e-01,  2.21572611e-13, -5.45548378e-03]]

The eigenvectors from mathematica are

    {{-0.131207, -0.99134, -0.00545548}, 
     {0.991355, -0.131209, 2.6987*10^-13}, 
     {-0.000715807, -0.00540832, 0.999985}}

In this case the eigenvectors agree between Mathematica and python, but...

Given

    COVMATRIX = [[2.9296875e-07, 0.0, 6.3368875e-10],
                 [0.0, 2.9296875e-07, 1.113615625e-09],
                 [6.3368875e-10, 1.113615625e-09, 5.0957159954e-11]]

The eigenvectors from python are

    [[ 2.16330513e-03,  8.69137041e-01,  4.94566602e-01],  
     [ 3.80169349e-03, -4.94571334e-01,  8.69128726e-01], 
     [-9.99990434e-01,  1.11146084e-12,  4.37410133e-03]]

But here some of the eigenvectors from python are opposite in sign to the corresponding eigenvectors from Mathematica, which are below

    {{-0.494567, -0.869129, -0.0043741},  
     {0.869137, -0.494571, 1.08198*10^-12}, 
     {-0.00216331, -0.00380169, 0.99999}}

vle*_*tre 5

If v is an eigenvector, then by definition: Av = lambda * v

Therefore, if v is an eigenvector, then -v is also an eigenvector, since: A * (-v) = -(A * v) = -(lambda * v) = lambda * (-v)

So both tools are correct. The goal of an eigenvector computation is to find linearly independent vectors (useful, for example, when diagonalizing the matrix), so it does not matter whether a tool gives you a vector or its opposite.

  • As the answer explains, each eigenvalue has a whole line of collinear eigenvectors, and none of them is any more "correct" than the others. If your subsequent computation depends on having chosen one particular one, then there is almost certainly something wrong with how you are using them. What exactly are you trying to do? (3 upvotes)