Finding matrix given eigenvalues and eigenvectors.























a) Let $B$ be a $2\times 2$ symmetric matrix and let $u$, $v$ be two eigenvectors of $B$ associated with the eigenvalues $w$ and $l$ respectively. Suppose
$$u=\left[\begin{array}{c}1/\sqrt2\\-1/\sqrt2\end{array}\right]
\quad\text{and}\quad
v=\left[\begin{array}{c}1/\sqrt2\\1/\sqrt2\end{array}\right]$$
with $w=1$ and $l=3$ respectively; find the matrix $B$.



b) Let $C$ be another symmetric matrix of order $n$ with characteristic polynomial $(w-w_1)(w-w_2)\cdots(w-w_n)$, where $w_1\le w_2\le\cdots\le w_n$. Prove that for any nonzero vector $x\in\mathbb{R}^n$, $w_1\le \dfrac{x^TCx}{x^Tx}\le w_n$.



What I have done:
For part (a), rescaling $u$ and $v$ to the orthogonal basis vectors
$$u_o=\left[\begin{array}{c}1\\-1\end{array}\right]
\quad\text{and}\quad
v_o=\left[\begin{array}{c}1\\1\end{array}\right],$$
and letting
$$M=\left[\begin{array}{cc}1&0\\0&3\end{array}\right]
\quad\text{and}\quad
S=\left[\begin{array}{cc}1&1\\-1&1\end{array}\right]$$
(the columns of $S$ being $u_o$ and $v_o$), the matrix $B$ is obtained as
$$B=SMS^{-1}=\left[\begin{array}{cc}2&1\\1&2\end{array}\right].$$
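As a quick sanity check (my own addition, not part of the assignment), here is a small NumPy sketch confirming that $SMS^{-1}$ gives the matrix above and that $u$, $v$ are indeed eigenvectors with eigenvalues $1$ and $3$; the variable names are just for illustration.

    import numpy as np

    # Unnormalized orthogonal eigenvectors as columns, and the diagonal matrix of eigenvalues
    S = np.array([[1.0, 1.0],
                  [-1.0, 1.0]])   # columns: u_o = (1, -1), v_o = (1, 1)
    M = np.diag([1.0, 3.0])       # eigenvalues w = 1 and l = 3

    B = S @ M @ np.linalg.inv(S)
    print(B)                      # [[2. 1.] [1. 2.]]

    # Check the eigenpairs directly: B u = 1*u and B v = 3*v
    u = np.array([1, -1]) / np.sqrt(2)
    v = np.array([1, 1]) / np.sqrt(2)
    print(np.allclose(B @ u, 1 * u), np.allclose(B @ v, 3 * v))  # True True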



However, for part (b), while I can see that $w_1,w_2,\ldots,w_n$ are the eigenvalues of $C$, I am unsure how to proceed, so may I get some help with this?










linear-algebra matrices eigenvalues-eigenvectors linear-transformations

asked Nov 26 at 17:06
Cheryl






















  • Use $Av=\lambda v$ to get a system of linear equations and solve.
    – Yadati Kiran
    Nov 26 at 17:13
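For concreteness, here is a sketch of that approach (my own elaboration, not part of the comment above): writing $B=\left[\begin{array}{cc}a&b\\b&c\end{array}\right]$, the conditions $Bu=1\cdot u$ and $Bv=3\cdot v$ give the linear system
$$a-b=1,\qquad b-c=-1,\qquad a+b=3,\qquad b+c=3,$$
whose solution $a=c=2$, $b=1$ again yields $B=\left[\begin{array}{cc}2&1\\1&2\end{array}\right]$.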















1 Answer

































Suppose that $\lVert x\rVert=1$. Then $x^Tx=1$. On the other hand, let $(v_1,\ldots,v_n)$ be an orthonormal basis of eigenvectors, such that the eigenvalue corresponding to $v_k$ is $w_k$. Then $x$ can be written as $\alpha_1v_1+\cdots+\alpha_nv_n$. So $Cx=\alpha_1w_1v_1+\cdots+\alpha_nw_nv_n$ and therefore$$x^TCx=\lvert\alpha_1\rvert^2w_1+\cdots+\lvert\alpha_n\rvert^2w_n.$$Since $\lvert\alpha_1\rvert^2+\cdots+\lvert\alpha_n\rvert^2=1$, this number is a convex combination of $w_1,\ldots,w_n$ and hence lies between $w_1$ and $w_n$.



If $x$ is an arbitrary non-zero vector, you can apply the previous result to $\frac{x}{\lVert x\rVert}$.
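A quick numerical illustration of this bound (my own addition, not part of the answer): for a random symmetric $C$ and random nonzero $x$, the Rayleigh quotient $x^TCx/x^Tx$ always lands between the smallest and largest eigenvalues.

    import numpy as np

    rng = np.random.default_rng(0)

    # Random symmetric matrix C and its eigenvalues w_1 <= ... <= w_n
    A = rng.standard_normal((5, 5))
    C = (A + A.T) / 2
    w = np.linalg.eigvalsh(C)      # eigvalsh returns eigenvalues in ascending order

    # Rayleigh quotient x^T C x / x^T x for many random nonzero x
    for _ in range(1000):
        x = rng.standard_normal(5)
        r = x @ C @ x / (x @ x)
        assert w[0] - 1e-12 <= r <= w[-1] + 1e-12
    print("all Rayleigh quotients lie in [w_1, w_n] =", (w[0], w[-1]))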






answered Nov 26 at 17:20
José Carlos Santos





















  • Sir, why in particular is $\lvert\alpha_1\rvert^2+\cdots+\lvert\alpha_n\rvert^2=1$?
    – Yadati Kiran
    Nov 26 at 17:29










  • Because the basis is orthonormal and $\lVert x\rVert=1$.
    – José Carlos Santos
    Nov 26 at 17:30












  • Oh ok sir, got it!
    – Yadati Kiran
    Nov 26 at 17:31










