# What are the relations between eigenvectors of $A$ and its adjoint $A^*$?

Everywhere I read that a matrix and its adjoint have essentially the same eigenvalues (the only difference between them is complex conjugation). Now I was wondering whether such a relation also exists between the eigenvectors of the two matrices. Do they have something in common?

Mathematics Asked by user66906 on November 12, 2021

Regarding the left and right eigenvectors, assuming that the matrix is diagonalizable: suppose we have one eigenvalue $\lambda$ of an $N \times N$ matrix $A$, and all of $A$'s eigenvectors. Find the $(N-1)$-dimensional subspace spanned by all the eigenvectors except the one associated with $\lambda$. Now consider the vector perpendicular to this subspace. It is the eigenvector of $A^T$ associated with $\lambda$. (This shows that it can be done, not that this is an efficient way to do it computationally.)
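The construction above can be checked numerically. The sketch below (with a hypothetical real diagonalizable matrix, so $A^* = A^T$) spans all eigenvectors except the chosen one and takes the orthogonal complement via an SVD:

```python
import numpy as np

# A hypothetical real diagonalizable matrix with distinct eigenvalues.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])

lam, V = np.linalg.eig(A)   # right eigenvectors as columns of V
k = 0                       # pick the eigenvalue lam[k]

# Span of all right eigenvectors EXCEPT the one for lam[k].
others = np.delete(V, k, axis=1)

# The vector perpendicular to that (N-1)-dimensional subspace is the
# last left-singular vector of `others` (its column space has rank N-1).
U, s, Vt = np.linalg.svd(others)
y = U[:, -1]

# y is an eigenvector of A^T (A is real, so A^* = A^T) for lam[k].
print(np.allclose(A.T @ y, lam[k] * y))   # True
```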

Answered by John Finn on November 12, 2021

There is a very important orthogonality relation between eigenvectors of a matrix and those of its adjoint, which is useful in many applications (for example, the reduction of differential equations). Let $A$ be a matrix and $A^*$ its adjoint.

If $A x = \lambda x$ and $A^* y = \mu y$ with $\lambda \neq \overline{\mu}$ (where $\overline{\,\cdot\,}$ denotes complex conjugation), then $x$ and $y$ are orthogonal, i.e. $(x, y) = 0$.

The proof is very easy. $$\lambda (x, y) = (A x, y) = (x, A^* y) = (x, \mu y) = \overline{\mu} (x, y),$$ which implies that $(\lambda - \overline{\mu})(x, y) = 0$. Since $\lambda \neq \overline{\mu}$, this implies the orthogonality of $x$ and $y$.
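This orthogonality relation is easy to verify numerically. A minimal sketch with a hypothetical complex matrix (using the convention where the inner product is conjugate-linear in its second argument, matching the proof above):

```python
import numpy as np

# A hypothetical complex matrix with distinct eigenvalues.
A = np.array([[1.0 + 1.0j, 2.0,       0.0],
              [0.0,        3.0,       1.0j],
              [0.0,        0.0,       5.0 - 2.0j]])

lams, X = np.linalg.eig(A)            # A x = lambda x
mus, Y = np.linalg.eig(A.conj().T)    # A* y = mu y

# Whenever lambda != conj(mu), the eigenvectors are orthogonal: (x, y) = 0.
for i, lam in enumerate(lams):
    for j, mu in enumerate(mus):
        if not np.isclose(lam, np.conj(mu)):
            inner = np.vdot(Y[:, j], X[:, i])   # (x, y), conjugation on y
            assert np.isclose(inner, 0.0)
print("all cross pairs orthogonal")
```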

Answered by pitonist on November 12, 2021

You are essentially asking about the relation between right and left eigenvectors of the same matrix. Basically, even though they occur for the same eigenvalues, there is not much relation between them. While right eigenvectors live in the vector space your matrix acts upon, the left eigenvectors live in the dual space (they are linear forms on the space). Even if one assumes for simplicity that the matrix is diagonalisable, the left eigenvectors for $\lambda$ are those linear forms that vanish on all the eigenspaces other than the one for $\lambda$; clearly, knowing just the eigenspace for $\lambda$ is not sufficient to know what that left eigenspace is. The only positive thing I can think of is that for every left eigenvector there is some right eigenvector for the same eigenvalue on which it does not vanish (obviously that vector is not unique).
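Both observations in this answer can be demonstrated numerically. For a diagonalizable matrix $A = V \Lambda V^{-1}$, the rows of $V^{-1}$ are left eigenvectors; a sketch with a hypothetical matrix shows that each one vanishes on every other eigenspace but pairs nontrivially with its own right eigenvector:

```python
import numpy as np

# A hypothetical diagonalizable real matrix with distinct eigenvalues.
A = np.array([[4.0, 1.0, 0.0],
              [2.0, 3.0, 0.0],
              [0.0, 0.0, 6.0]])

lams, V = np.linalg.eig(A)    # right eigenvectors as columns of V
W = np.linalg.inv(V)          # rows of V^{-1} are left eigenvectors of A

for k in range(3):
    w = W[k]                                        # left eigenvector for lams[k]
    assert np.allclose(w @ A, lams[k] * w)          # w A = lambda w
    for j in range(3):
        pairing = w @ V[:, j]                       # the form applied to eigenvector j
        if j == k:
            assert not np.isclose(pairing, 0.0)     # nonzero on its own eigenvector
        else:
            assert np.isclose(pairing, 0.0)         # vanishes on the other eigenspaces
print("left eigenvectors vanish on all other eigenspaces")
```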

Answered by Marc van Leeuwen on November 12, 2021
