[[_TOC_]]
# Introduction
The covariance matrix ($`\Sigma`$) is a crucial input to our analysis; we estimate it from several off-source samples.

We use the inverse of the covariance matrix to compute the weighted inner product between two vectors, such as the inner product between two templates or between the template $`S(\alpha)`$ and the on-source data $`Y(\alpha)`$. Accurate computation of the inverse matrix is therefore essential for robust results. One can check this accuracy using the following property: multiplying a matrix by its inverse yields the identity matrix,
```math
\Sigma \: \Sigma^{-1} = \Sigma^{-1} \: \Sigma = I
```
Thus, the diagonal elements of the above matrix product must be equal to 1 (or nearly equal to 1 in numerical computation), and the off-diagonal elements must be zero (or nearly equal to zero). The result should also be invariant under the choice of the right-hand or the left-hand inverse product.
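
As a minimal sketch of this check (assuming `Sigma` is the estimated covariance matrix and `Sigma_inv` its computed inverse, both NumPy arrays; the function name is illustrative):

```python
import numpy as np

def check_inverse(Sigma, Sigma_inv, atol=1e-8):
    """Verify that Sigma_inv is both a right-hand and a left-hand inverse of Sigma."""
    identity = np.eye(Sigma.shape[0])
    right = Sigma @ Sigma_inv   # right-hand product
    left = Sigma_inv @ Sigma    # left-hand product
    print("max |right - I|:", np.max(np.abs(right - identity)))
    print("max |left  - I|:", np.max(np.abs(left - identity)))
    return np.allclose(right, identity, atol=atol) and np.allclose(left, identity, atol=atol)
```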
# Simple numpy inverse
So far, we have used the simple [`numpy.linalg.inv()`](https://numpy.org/doc/stable/reference/generated/numpy.linalg.inv.html) function to calculate the inverse of the matrix. In this case, we found disagreement between the right-hand and the left-hand inverse products, and the off-diagonal elements are not nearly equal to zero.
<img src="uploads/5c14e15cf27db9dde43ae57c9da231e0/inv.png" width="440" ><img src="uploads/5f028dfe024ae56916586e2488a3c602/inv_off_diag.png" width="440" >
In the above figures, the quantity 'inv' refers to the `numpy.linalg.inv()` function. In the right plot, the quantity 'k' refers to the index of the off-diagonal array: k>0 for diagonals above the main diagonal, and k<0 for diagonals below the main diagonal. Note that the off-diagonal elements are substantially larger than zero for the left-hand inverse matrix only, and they are not consistent with those of the right-hand inverse matrix.
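
The k-th off-diagonal of such a product can be extracted with the `offset` argument of [`numpy.diagonal()`](https://numpy.org/doc/stable/reference/generated/numpy.diagonal.html), which follows the same sign convention as 'k' above. A small sketch (the name `P` for the product `Sigma @ Sigma_inv` is illustrative):

```python
import numpy as np

def max_off_diagonals(P, ks=(-2, -1, 1, 2)):
    """Largest |element| on each k-th off-diagonal of a matrix product P."""
    # offset > 0 selects diagonals above the main diagonal, offset < 0 below it.
    return {k: np.max(np.abs(np.diagonal(P, offset=k))) for k in ks}
```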
# Moore-Penrose pseudo-inverse
To resolve this issue, we propose to use the Moore-Penrose pseudo-inverse method. This method calculates the generalized inverse of a matrix from its singular-value decomposition (SVD), including all large singular values. We use the built-in NumPy function [`numpy.linalg.pinv()`](https://numpy.org/doc/stable/reference/generated/numpy.linalg.pinv.html). If $`\Sigma`$ is an $`n\times n`$ nonsingular matrix, then its inverse is given by
```math
\Sigma = U \: D \: V^T \ \ \text{or} \ \ \Sigma^{-1} = V \: D^{-1} \: U^T
```
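
As a minimal sketch of this decomposition (with a toy symmetric matrix standing in for $`\Sigma`$; the cutoff plays the role of the `rcond` parameter of `numpy.linalg.pinv()`):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((200, 50))
Sigma = A.T @ A / 200   # toy symmetric stand-in for the covariance matrix

# SVD-based inverse following the formula above: Sigma = U D V^T and
# Sigma^{-1} = V D^{-1} U^T, keeping only singular values above a relative cutoff.
U, s, Vt = np.linalg.svd(Sigma)
rcond = 1e-15
s_inv = np.where(s > rcond * s.max(), 1.0 / s, 0.0)
Sigma_inv_svd = Vt.T @ np.diag(s_inv) @ U.T

# numpy's built-in pseudo-inverse performs the same truncated-SVD computation.
Sigma_inv_pinv = np.linalg.pinv(Sigma, rcond=rcond)
print(np.allclose(Sigma_inv_svd, Sigma_inv_pinv))   # expect True
```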
<img src="uploads/4692d03ff868fcc538ab06803102a7c3/pinv.png" width="440" ><img src="uploads/8d64ddc718f1fe06d54b62ca6028d123/pinv_off_diag.png" width="440" >
In the above figures, the quantity 'pinv' refers to the `numpy.linalg.pinv()` function. The figures indicate that the inner product using the right-hand inverse matrix is consistent with that using the left-hand inverse matrix, and the off-diagonal elements are nearly equal to zero. Therefore, `pinv` is more robust.
# Ill-conditioned covariance matrix
Now, we show that the covariance matrix is ill-conditioned (i.e., nearly singular). Let us focus on the eigenvalues of the covariance matrix.
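
A quick way to quantify this (assuming `Sigma` is the estimated covariance matrix) is to inspect its eigenvalue spectrum and condition number:

```python
import numpy as np

# Eigenvalues of the symmetric covariance matrix, returned in ascending order.
eigenvalues = np.linalg.eigvalsh(Sigma)
print("smallest eigenvalue:", eigenvalues[0])
print("largest eigenvalue :", eigenvalues[-1])

# Condition number (ratio of largest to smallest singular value); a very
# large value signals that Sigma is nearly singular.
print("condition number   :", np.linalg.cond(Sigma))
```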
<img src="uploads/b4604ec7877c560d6f7aa9232ffd4c08/eigenvalues1.png" width="440" >