Finding matrix given eigenvalues and eigenvectors.
a) Let $B$ be a $2\times 2$ symmetric matrix and let $u$, $v$ be two eigenvectors of $B$ associated with the eigenvalues $w$ and $l$ respectively. Suppose
$$u = \left[\begin{array}{c} 1/\sqrt{2} \\ -1/\sqrt{2} \end{array}\right]
\quad\text{and}\quad
v = \left[\begin{array}{c} 1/\sqrt{2} \\ 1/\sqrt{2} \end{array}\right]$$
with $w=1$ and $l=3$ respectively. Find the matrix $B$.
b) Let $C$ be another symmetric matrix of order $n$ with characteristic polynomial $(w-w_1)(w-w_2)\cdots(w-w_n)$, where $w_1\le w_2\le\cdots\le w_n$. Prove that for any nonzero vector $x\in\mathbb{R}^n$,
$$w_1\le \frac{x^TCx}{x^Tx}\le w_n.$$
What I have done:
For part (a), rescaling $u$ and $v$ to the orthogonal (unnormalized) vectors
$$u_o=\left[\begin{array}{c} 1 \\ -1 \end{array}\right]
\quad\text{and}\quad
v_o=\left[\begin{array}{c} 1 \\ 1 \end{array}\right],$$
and letting
$$M=\left[\begin{array}{cc} 1&0 \\ 0&3 \end{array}\right]
\quad\text{and}\quad
S=\left[\begin{array}{cc} 1&1 \\ -1&1 \end{array}\right],$$
the matrix $B$ is obtained as
$$B=SMS^{-1}=\left[\begin{array}{cc} 2&1 \\ 1&2 \end{array}\right].$$
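As a quick sanity check, here is a minimal NumPy sketch (not part of the original working) verifying that $SMS^{-1}$ gives this $B$ and that $u$, $v$ are indeed eigenvectors for $1$ and $3$:

```python
import numpy as np

# Columns of S are the unnormalized eigenvectors u_o, v_o
S = np.array([[1.0, 1.0],
              [-1.0, 1.0]])
# Diagonal matrix of the corresponding eigenvalues w = 1 and l = 3
M = np.diag([1.0, 3.0])

# B = S M S^{-1}
B = S @ M @ np.linalg.inv(S)
print(B)  # expected: [[2. 1.], [1. 2.]]

# Check the eigenpairs: B u = 1*u and B v = 3*v
u = np.array([1.0, -1.0]) / np.sqrt(2)
v = np.array([1.0, 1.0]) / np.sqrt(2)
print(np.allclose(B @ u, 1 * u), np.allclose(B @ v, 3 * v))  # True True
```

Either the normalized eigenvectors or the rescaled $u_o$, $v_o$ can be used as the columns of $S$; rescaling a column of $S$ rescales the corresponding row of $S^{-1}$ inversely, so the product $SMS^{-1}$ is unchanged.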
However, for part (b), while I can derive that $w_1,w_2,\ldots,w_n$ are the eigenvalues of $C$, I am unsure how to proceed, so may I get some help with this?
linear-algebra matrices eigenvalues-eigenvectors linear-transformations
asked Nov 26 at 17:06
Cheryl
Use $Av=\lambda v$ to get a system of linear equations and solve.
– Yadati Kiran
Nov 26 at 17:13
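For what it's worth, here is a minimal SymPy sketch of that suggestion (the symmetric parametrization $B=\left[\begin{array}{cc} a&b \\ b&c \end{array}\right]$ and the variable names are my own for illustration, not taken from the comment):

```python
import sympy as sp

a, b, c = sp.symbols('a b c')
B = sp.Matrix([[a, b], [b, c]])        # unknown symmetric 2x2 matrix
u = sp.Matrix([1, -1]) / sp.sqrt(2)    # eigenvector with eigenvalue 1
v = sp.Matrix([1, 1]) / sp.sqrt(2)     # eigenvector with eigenvalue 3

# B*u = 1*u and B*v = 3*v give four linear equations in a, b, c
eqs = list(B * u - 1 * u) + list(B * v - 3 * v)
print(sp.solve(eqs, [a, b, c]))        # {a: 2, b: 1, c: 2}, i.e. B = [[2, 1], [1, 2]]
```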
1 Answer
Suppose that $\lVert x\rVert=1$. Then $x^Tx=1$. On the other hand, let $(v_1,\ldots,v_n)$ be an orthonormal basis of eigenvectors such that the eigenvalue corresponding to $v_k$ is $w_k$. Then $x$ can be written as $\alpha_1v_1+\cdots+\alpha_nv_n$. So $Cx=\alpha_1w_1v_1+\cdots+\alpha_nw_nv_n$, and therefore $$x^TCx=\lvert\alpha_1\rvert^2w_1+\cdots+\lvert\alpha_n\rvert^2w_n.$$ Since $\lvert\alpha_1\rvert^2+\cdots+\lvert\alpha_n\rvert^2=1$, this number is between $w_1$ and $w_n$.
If $x$ is an arbitrary non-zero vector, you can apply the previous result to $\frac{x}{\lVert x\rVert}$.
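As a rough numerical illustration of this bound (my own sketch, not part of the answer), one can check that the Rayleigh quotient $x^TCx/x^Tx$ of a random symmetric matrix always lies between its smallest and largest eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
C = (A + A.T) / 2            # a random symmetric matrix
w = np.linalg.eigvalsh(C)    # eigenvalues in ascending order: w[0] = w_1, w[-1] = w_n

for _ in range(1000):
    x = rng.standard_normal(5)
    r = x @ C @ x / (x @ x)  # Rayleigh quotient x^T C x / x^T x
    assert w[0] - 1e-12 <= r <= w[-1] + 1e-12
print("all quotients lie in", (w[0], w[-1]))
```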
answered Nov 26 at 17:20
José Carlos Santos
Sir, why in particular is $\lvert\alpha_1\rvert^2+\cdots+\lvert\alpha_n\rvert^2=1$?
– Yadati Kiran
Nov 26 at 17:29
Because the basis is orthonormal and $\lVert x\rVert=1$.
– José Carlos Santos
Nov 26 at 17:30
Oh ok sir, got it!
– Yadati Kiran
Nov 26 at 17:31