lippy9513 · 11-04-2024 · Mathematics (answered)

It is known that a real matrix $A$ has eigenvalues $\lambda_1 = -2$ and $\lambda_2 = 1 + 3i$, with corresponding eigenvectors $v_1 = \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}$ and $v_2 = \begin{bmatrix} 0 \\ 1 \\ 1+i \end{bmatrix}$, and that $x(t)$ solves $x'(t) = A x(t)$ with initial conditions?
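
Since the specific initial condition is not stated above, only the general setup can be outlined. Because $A$ is real, the complex eigenvalue $\lambda_2 = 1+3i$ is accompanied by its conjugate $\bar\lambda_2 = 1-3i$ with eigenvector $\bar v_2$, so (assuming $A$ is $3 \times 3$ with exactly these three eigenvalues) a real general solution is built from $e^{-2t}v_1$ together with the real and imaginary parts of $e^{(1+3i)t}v_2$. A sketch of that computation:

$$
e^{(1+3i)t} v_2
= e^{t}(\cos 3t + i\sin 3t)\begin{bmatrix} 0 \\ 1 \\ 1+i \end{bmatrix}
= e^{t}\begin{bmatrix} 0 \\ \cos 3t \\ \cos 3t - \sin 3t \end{bmatrix}
+ i\, e^{t}\begin{bmatrix} 0 \\ \sin 3t \\ \cos 3t + \sin 3t \end{bmatrix},
$$

$$
x(t) = c_1 e^{-2t}\begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}
+ c_2\, e^{t}\begin{bmatrix} 0 \\ \cos 3t \\ \cos 3t - \sin 3t \end{bmatrix}
+ c_3\, e^{t}\begin{bmatrix} 0 \\ \sin 3t \\ \cos 3t + \sin 3t \end{bmatrix},
$$

where the constants $c_1, c_2, c_3$ would be determined by evaluating $x(0)$ against the (unstated) initial condition.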