Cross product in higher dimensions
Suppose we have a vector $(a,b)$ in $2$-space. Then the vector $(-b,a)$ is orthogonal to the one we started with. Furthermore, the function $$(a,b) \mapsto (-b,a)$$ is linear.
Suppose instead we have two vectors $x$ and $y$ in $3$-space. Then the cross product gives us a new vector $x \times y$ that's orthogonal to the first two. Furthermore, cross products are bilinear.
Question. Can we do this in higher dimensions? For example, is there a way of turning three vectors in $4$-space into a fourth vector, orthogonal to the others, in a trilinear way?
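Both low-dimensional constructions are easy to check numerically. A minimal sketch in Python (the helper names `perp2`, `cross3`, and `dot` are my own, not part of the question):

```python
def perp2(v):
    """The linear map (a, b) -> (-b, a); its output is orthogonal to v."""
    a, b = v
    return (-b, a)

def cross3(x, y):
    """The standard bilinear cross product in 3-space."""
    return (x[1] * y[2] - x[2] * y[1],
            x[2] * y[0] - x[0] * y[2],
            x[0] * y[1] - x[1] * y[0])

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

v = (3, 4)
assert dot(v, perp2(v)) == 0

x, y = (1, 2, 3), (4, 5, 6)
p = cross3(x, y)
assert dot(x, p) == 0 and dot(y, p) == 0
```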
linear-algebra vectors orthogonality multilinear-algebra cross-product
You might want to look at the Gram–Schmidt method here: en.wikipedia.org/wiki/Gram%E2%80%93Schmidt_process
– Furrane
Jul 25 '17 at 10:06
This construction has nothing to do with the cross product; the two just happen to coincide in dimension 3.
– Miguel
Jul 25 '17 at 10:54
A search for "generalized cross product" turns up a number of questions likely to be of interest, including Is the vector cross product only defined for 3D? and Generalized Cross Product. ;) (Not marking as a duplicate because you're better able to judge which question, if any, most nearly matches yours.)
– Andrew D. Hwang
Jul 25 '17 at 17:38
You might be interested in the notion of the orthogonal complement. It can give you the vector orthogonal to a given set of $n-1$ independent vectors in $n$-space, like you're asking for $n=4$. But it can also give you $k$ independent vectors orthogonal to a given set of $n-k$ independent vectors in $n$-space. So you can take two vectors in 4-space and find two vectors perpendicular to them and to each other.
– YawarRaza7349
Jul 25 '17 at 17:39
en.wikipedia.org/wiki/Seven-dimensional_cross_product
– user57159
Jul 26 '17 at 3:18
edited Dec 2 at 16:44 by José Carlos Santos
asked Jul 25 '17 at 10:04 by goblin
4 Answers
Yes. It's just like in dimension $3$: if your vectors are $(t_1,t_2,t_3,t_4)$, $(u_1,u_2,u_3,u_4)$, and $(v_1,v_2,v_3,v_4)$, compute the formal determinant $$\begin{vmatrix}t_1&t_2&t_3&t_4\\u_1&u_2&u_3&u_4\\v_1&v_2&v_3&v_4\\e_1&e_2&e_3&e_4\end{vmatrix},$$ where you see $(e_1,e_2,e_3,e_4)$ as the canonical basis of $\mathbb{R}^4$. Then the previous determinant is $(\alpha_1,\alpha_2,\alpha_3,\alpha_4)$ with \begin{align*}\alpha_1&=t_4u_3v_2-t_3u_4v_2-t_4u_2v_3+t_2u_4v_3+t_3u_2v_4-t_2u_3v_4\\\alpha_2&=-t_4u_3v_1+t_3u_4v_1+t_4u_1v_3-t_1u_4v_3-t_3u_1v_4+t_1u_3v_4\\\alpha_3&=t_4u_2v_1-t_2u_4v_1-t_4u_1v_2+t_1u_4v_2+t_2u_1v_4-t_1u_2v_4\\\alpha_4&=-t_3u_2v_1+t_2u_3v_1+t_3u_1v_2-t_1u_3v_2-t_2u_1v_3+t_1u_2v_3\end{align*} It's a vector orthogonal to the other three.
I followed a suggestion taken from the comments on this answer: to put the entries $e_1$, $e_2$, $e_3$, and $e_4$ in the bottom row. It makes no difference in odd dimensions, but it produces the natural sign in even dimensions.
Following another suggestion, I would like to add this remark: $$\alpha_1=-\begin{vmatrix}t_2&t_3&t_4\\u_2&u_3&u_4\\v_2&v_3&v_4\end{vmatrix}\text{, }\alpha_2=\begin{vmatrix}t_1&t_3&t_4\\u_1&u_3&u_4\\v_1&v_3&v_4\end{vmatrix}\text{, }\alpha_3=-\begin{vmatrix}t_1&t_2&t_4\\u_1&u_2&u_4\\v_1&v_2&v_4\end{vmatrix}\text{ and }\alpha_4=\begin{vmatrix}t_1&t_2&t_3\\u_1&u_2&u_3\\v_1&v_2&v_3\end{vmatrix}.$$
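This recipe can be sketched in code: expanding the formal determinant along the bottom (basis) row gives each $\alpha_j$ as $(-1)^{4+j}$ times the $3\times 3$ minor obtained by deleting column $j$, and orthogonality follows because a determinant with a repeated row vanishes. The helper names `det3` and `cross4` are mine:

```python
def det3(m):
    # Explicit 3x3 determinant, cofactor expansion along the first row.
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def cross4(t, u, v):
    """Expand the 4x4 formal determinant along its bottom (basis) row:
    alpha_j = (-1)**(4 + j) * det(minor_j) for 1-indexed column j."""
    alphas = []
    for j in range(4):  # 0-indexed column, so the sign is (-1)**(4 + j + 1)
        minor = [[row[k] for k in range(4) if k != j] for row in (t, u, v)]
        alphas.append((-1) ** (4 + j + 1) * det3(minor))
    return alphas

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

t, u, v = (1, 2, 3, 4), (0, 1, 0, 2), (5, 0, 1, 1)
p = cross4(t, u, v)
# A determinant with a repeated row is zero, so p is orthogonal to t, u, v.
assert dot(t, p) == 0 and dot(u, p) == 0 and dot(v, p) == 0
```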
Very lucid and appreciable answer, thanks.
– Arpit Yadav
Jul 25 '17 at 10:18
I'm pretty sure the row of basis vectors should be on the bottom to get the right-handedness correct; only in odd dimensions can this row be moved to the top without a change in sign whilst keeping the vectors in the same order.
– lastresort
Jul 25 '17 at 18:06
(+1) This is the method I've used in a few answers. The formula is easy to remember!
– robjohn♦
Jul 25 '17 at 22:50
@lastresort Nice remark. I shall edit my answer taking that into account.
– José Carlos Santos
Jul 25 '17 at 22:54
Is $$\alpha_1 = {\rm det} \left| \matrix{ t_2 & t_3 & t_4 \\ u_2 & u_3 & u_4 \\ v_2 & v_3 & v_4 } \right|$$ and so on?
– ja72
Jul 26 '17 at 14:19
My answer is in addition to José's and Antinous's answers but maybe somewhat more abstract. In principle their answers use coordinates, whereas I'm trying to do it coordinate-free.
What you are looking for is the wedge or exterior product.
The exterior power $\bigwedge^k(V)$ of a vector space $V$ is the quotient of the tensor power $\bigotimes^k(V)$ by the relations $v\otimes v=0$.
To be somewhat more concrete and less abstract, this just means that for any vector $v\in V$ the wedge product satisfies $v\wedge v=0\in\bigwedge^2(V)$. Whenever you wedge vectors together, the result equals zero if the factors are linearly dependent.
Think of what happens to the cross product in $\mathbb{R}^3$.
In fact, let $e_1,e_2,\ldots,e_n$ be a basis of an inner product space $V$. Then the products $e_{i_1}\wedge e_{i_2}\wedge \ldots \wedge e_{i_k}$ with $1\leq i_1 < i_2 < \ldots < i_k\leq n$ form a basis for $\bigwedge^k(V)$.
If $V=\mathbb{R}^3$ then $v \wedge w$ equals $v \times w$ up to signs of the entries. This seems a bit obscure because technically $v\wedge w$ should be an element of $\bigwedge^2(\mathbb{R}^3)$. However, the latter vector space is isomorphic to $\mathbb{R}^3$. In fact, this relation holds for all exterior powers once an orientation on the vector space is given.
The isomorphism is called the Hodge star operator.
It provides an isomorphism $\star\colon\bigwedge^{n-k}(V)\to\bigwedge^{k}(V)$. This map operates on an $(n-k)$-wedge $\beta$ via the relation
$$
\alpha \wedge \beta = \langle \alpha,\star\beta \rangle \,\omega
$$
where $\alpha\in\bigwedge^{k}(V)$, $\omega$ is an orientation form on $V$, and $\langle \cdot,\cdot \rangle$ is the induced inner product on $\bigwedge^{k}(V)$ (see the wiki page). Notice that the wiki page defines the relation the other way around.
How does all this answer your question, you ask?
Well, let us take $k=1$ and $V=\mathbb{R}^n$. Then the Hodge star isomorphism identifies the spaces $\bigwedge^{n-1}(\mathbb{R}^n)$ and $\bigwedge^{1}(\mathbb{R}^n)=\mathbb{R}^n$. This is good because you originally wanted to say something about orthogonality between a set of $n-1$ linearly independent vectors $v_1,v_2,\ldots,v_{n-1}$ and their "cross product".
Now let us do exactly that and set $\beta :=v_1 \wedge v_2 \wedge \ldots \wedge v_{n-1}\in\bigwedge^{n-1}(\mathbb{R}^n)$. Then the image $\star\beta = \star(v_1 \wedge v_2 \wedge \ldots \wedge v_{n-1})$ is a regular vector in $\mathbb{R}^n$, and the defining condition above implies, for $\alpha=v_i\in\mathbb{R}^n=\bigwedge^{1}(\mathbb{R}^n)$,
$$
v_i \wedge (v_1 \wedge v_2 \wedge \ldots \wedge v_{n-1}) = \alpha \wedge \beta = \langle \alpha,\star\beta \rangle \,\omega = \langle v_i,\star\beta \rangle \,\omega.
$$
However, the left-hand side equals zero for $i=1,2,\ldots,n-1$, so the vector $\star\beta$ is orthogonal to all vectors $v_1,v_2,\ldots,v_{n-1}$, which is what you asked for. So you might want to define the cross product of $n-1$ vectors as $v_1 \times v_2 \times \ldots \times v_{n-1} := \star(v_1 \wedge v_2 \wedge \ldots \wedge v_{n-1})$.
Maybe keep in mind that the other two answers implicitly use the Hodge star operation (and also a basis) to compute the "cross product in higher dimensions" through the formal determinant, which is encoded here in the use of the wedge product.
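For basis blades of an orthonormal, positively oriented basis, the sign of the Hodge star reduces to the sign of a permutation. A sketch under that convention (the helper names are hypothetical, not from any library); it reproduces, for example, $\star(e_1\wedge e_3)=-e_2\wedge e_4=e_4\wedge e_2$ in $\mathbb{R}^4$:

```python
def perm_sign(p):
    # Sign of a permutation of 0..n-1, computed by sorting with transpositions.
    sign, p = 1, list(p)
    for i in range(len(p)):
        while p[i] != i:
            j = p[i]
            p[i], p[j] = p[j], p[i]
            sign = -sign
    return sign

def hodge_star_basis(indices, n):
    """Hodge star of the basis blade e_{i_1} ^ ... ^ e_{i_k} (0-indexed,
    strictly increasing indices) in R^n with the standard orientation.
    Returns (sign, complementary indices): star(e_I) = sign * e_{I^c}."""
    comp = tuple(i for i in range(n) if i not in indices)
    return perm_sign(tuple(indices) + comp), comp

# star(e1 ^ e3) in R^4, written 0-indexed as (0, 2): the sign of the
# permutation (1,3,2,4) of (1,2,3,4) is -1, so the result is -(e2 ^ e4).
assert hodge_star_basis((0, 2), 4) == (-1, (1, 3))

# In R^3, star(e1 ^ e2) = e3: the cross product of the first two basis vectors.
assert hodge_star_basis((0, 1), 3) == (1, (2,))
```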
So concretely, how do we actually know what the Hodge star of a $k$-blade is? For example, work in $4$-space with the standard orientation. Suppose we want to know $\star(v_1 \wedge v_3).$ If I understand correctly, it's either $v_2 \wedge v_4$ or else $v_4 \wedge v_2$. How do we know which one?
– goblin
Jul 25 '17 at 13:32
It depends on the orientation that you choose for your vector space. Let's say $v_1,v_2,v_3,v_4$ form an oriented basis for $V$ (that is, $\omega = v_1\wedge v_2\wedge v_3\wedge v_4$); then $\star(v_1\wedge v_3)=v_4\wedge v_2$. This can be seen using the defining relation for $\alpha=v_i\wedge v_j$, cycling through all possible combinations $(i,j)$. This is what they say on the wiki page linked above in the section "Computation of the Hodge star", albeit expressed in a slightly complicated way, in my opinion.
– Sven Pistre
Jul 25 '17 at 14:25
Of all combinations $(i,j)$ only $(2,4)$ and $(4,2)$ remain (because otherwise the left-hand side equals zero). Then you assume $\star(v_1\wedge v_3)=v_k\wedge v_l$ and think about which combinations for $(k,l)$ remain on the right-hand side of the defining relation. Then you will see that the only possible one is $(k,l)=(4,2)$. To see the last part, look at the definition of the induced scalar product on $\bigwedge^2(V)$.
– Sven Pistre
Jul 25 '17 at 14:28
And also, as I forgot to mention this, $v_2\wedge v_4=-v_4\wedge v_2$. Changing the position of two vectors in a $k$-wedge just changes the sign. So really it only depends on the chosen orientation (or "right-handedness") of your vector space.
– Sven Pistre
Jul 25 '17 at 14:49
@étale-cohomology The Hodge star depends on a choice of inner product and orientation. So it is not canonical. I don't think that you can identify them canonically.
– Sven Pistre
Jul 25 '17 at 17:42
You can work out the cross product $p$ in $n$ dimensions using the following:
$$p=\det\left(\begin{array}{lllll}e_1&x_1&y_1&\cdots&z_1\\e_2&x_2&y_2&\cdots&z_2\\\vdots&\vdots&\vdots&\ddots&\vdots\\e_n&x_n&y_n&\cdots&z_n\end{array}\right),$$
where $\det$ is the formal determinant of the matrix, the $e_i$ are the basis vectors (e.g. $\hat{i},\hat{j},\hat{k}$, etc.), and $x,y,\ldots,z$ are the $n-1$ vectors you wish to "cross".
You will find that $x\cdot p=y\cdot p=\cdots=z\cdot p=0$.
It's wonderful that the determinant produces a vector with this property.
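This formula can be sketched in pure Python (the names `det` and `cross_n` are my own): expanding the formal determinant along the first column, which holds the $e_i$, gives component $i$ as a signed $(n-1)\times(n-1)$ minor, and orthogonality follows because a determinant with a repeated column vanishes.

```python
def det(m):
    # Recursive cofactor expansion along the first row; fine for small n.
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

def cross_n(vectors):
    """Formal determinant with the basis vectors e_1..e_n in the first
    column and the n-1 input vectors in the remaining columns, expanded
    along that first column: p_i = (-1)**i * det(minor_i), 0-indexed i."""
    n = len(vectors) + 1
    assert all(len(v) == n for v in vectors)
    p = []
    for i in range(n):
        # Minor: all input-vector entries with coordinate i deleted,
        # arranged so that each input vector is a column.
        minor = [[v[r] for v in vectors] for r in range(n) if r != i]
        p.append((-1) ** i * det(minor))
    return p

# In R^3 this reproduces the usual cross product ...
assert cross_n([(1, 0, 0), (0, 1, 0)]) == [0, 0, 1]

# ... and in R^5 the result is orthogonal to all four inputs.
vs = [(1, 2, 3, 4, 5), (0, 1, 0, 2, 0), (5, 0, 1, 1, 2), (1, 1, 1, 1, 1)]
p = cross_n(vs)
assert all(sum(a * b for a, b in zip(v, p)) == 0 for v in vs)
```

Note that, as discussed in the comments on the other answer, placing the $e_i$ in the first column rather than the last can flip the overall sign in even dimensions; orthogonality holds either way.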
Are there any requirements on the basis vectors $e_1, ..., e_n$? Like, do they need to form an orthonormal basis, or something?
– étale-cohomology
Jul 25 '17 at 17:01
@étale-cohomology they are the basis vectors. By definition they are all orthonormal.
– The Great Duck
Jul 25 '17 at 20:31
Yeah but do they have to be? Can't we take any other basis?
– étale-cohomology
Jul 25 '17 at 20:35
By changing the basis to $\widetilde e_i$ you will have to change the vector entries to the coefficients $\widetilde x_i$ in the basis expansion for the new basis. Remember that the above is only a formal determinant, as this is not actually a matrix (since the first column consists of entries that are vectors themselves). So it does not matter if the basis is orthonormal or not, but you will have to adjust your formal determinant formula.
– Sven Pistre
Jul 25 '17 at 21:48
(+1) This is the logical extension of José Carlos Santos' answer to $\mathbb{R}^n$ (at first, this is what I thought he had given, but now I see his only covers $\mathbb{R}^4$).
– robjohn♦
Jul 25 '17 at 22:53
Yes, and apart from the other answers, an interesting approach is to think about it using Clifford algebra.
This can introduce you to the basic concept in a nonrigorous but approachable manner:
https://slehar.wordpress.com/2014/03/18/clifford-algebra-a-visual-introduction/
Thank you for your answer; however, that article is extremely long and it's difficult to find a Clifford-style answer to my question by reading through it. Can I ask you to write up some details on how to compute an actual cross product using the Clifford approach?
– goblin
Jul 26 '17 at 8:36
answered Jul 25 '17 at 10:15 by José Carlos Santos (last edited Aug 29 at 18:11)
– Sven Pistre
Jul 25 '17 at 14:28
And also, as I forgot to mention this, $v_2wedge v_4=-v_4wedge v_2$. Changing the position of two vectors in a $k$-wedge just changes the sign. So really it only depends on the chosen orientation (or "right-handedness") of your vector space.
– Sven Pistre
Jul 25 '17 at 14:49
2
@étale-cohomology The hodge star depends on a choice of inner product and orientation. So it is not canonical. I don't think that you can identify them canonically.
– Sven Pistre
Jul 25 '17 at 17:42
|
show 9 more comments
My answer is in addition to José's and Antinous's answers but maybe somewhat more abstract. In principle, their answers use coordinates, whereas I'm trying to do it coordinate-free.
What you are looking for is the wedge or exterior product.
The exterior power $bigwedge^k(V)$ of a vector space $V$ is the quotient of the tensor power $bigotimes^k(V)$ by the relations $votimes v=0$.
To be somewhat more concrete and less abstract, this just means that for any vector $vin V$ the wedge product $vwedge v=0inbigwedge^2(V)$. Whenever you wedge vectors together, the result equals zero if at least two of the factors are linearly dependent.
Think of what happens to the cross product in $mathbb{R}^3$.
In fact, let $e_1,e_2,ldots,e_n$ be a basis of an inner product space $V$. Then $e_{i_1}wedge e_{i_2}wedge ldots wedge e_{i_k}$ is a basis for $bigwedge^k(V)$ where $1leq i_1 < i_2 < ldots < i_kleq n$.
If $V=mathbb{R}^3$ then $v wedge w$ equals $v times w$ up to signs of the entries. This seems a bit obscure because technically $vwedge w$ should be an element of $bigwedge^2(mathbb{R}^3)$. However, the latter vector space is isomorphic to $mathbb{R}^3$. In fact, this relation is true for all exterior powers given an orientation on the vector space.
The isomorphism is called the Hodge star operator.
It says that there is an isomorphism $starcolonbigwedge^{n-k}(V)tobigwedge^{k}(V)$. This map operates on an $(n-k)$-wedge $beta$ via the relation
$$
alpha wedge beta = langle alpha,starbeta rangle ,omega
$$
where $alphainbigwedge^{k}(V)$, $omega$ is an orientation form on $V$ and $langle cdot,cdot rangle$ is the induced inner product on $bigwedge^{k}(V)$ (see wiki). Notice that the wiki-page defines the relation the other way around.
How does all this answer your question, you ask?
Well, let us take $k=1$ and $V=mathbb{R}^n$. Then the Hodge star isomorphism identifies the spaces $bigwedge^{n-1}(mathbb{R}^n)$ and $bigwedge^{1}(mathbb{R}^n)=mathbb{R}^n$. This is good because you originally wanted to say something about orthogonality between a set of $n-1$ linearly independent vectors $v_1,v_2,ldots,v_{n-1}$ and their "cross product".
Now let us do exactly that and set $beta :=v_1 wedge v_2 wedge ldots wedge v_{n-1}inbigwedge^{n-1}(mathbb{R}^n)$. Then the image $starbeta = star(v_1 wedge v_2 wedge ldots wedge v_{n-1})$ is a regular vector in $mathbb{R}^n$ and the defining condition above implies for $alpha=v_iinmathbb{R}^n=bigwedge^{1}(mathbb{R}^n)$
$$
v_i wedge (v_1 wedge v_2 wedge ldots wedge v_{n-1}) = alpha wedge beta = langle alpha,starbeta rangle ,omega = langle v_i,starbeta rangle ,omega.
$$
However, the left hand side equals zero for $i=1,2,ldots,n-1$, so that the vector $starbeta$ is orthogonal to all vectors $v_1,v_2,ldots,v_{n-1}$ which is what you asked for. So you might want to define the cross product of $n-1$ vectors as $v_1 times v_2 times ldots times v_{n-1} := star(v_1 wedge v_2 wedge ldots wedge v_{n-1})$.
Maybe keep in mind that the other two answers implicitly use the Hodge star operation (and also a basis) to compute the "cross product in higher dimension" through the formal determinant which is encoded in the use of the wedge product here.
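To make this concrete, here is a minimal numeric sketch (my own illustration, not part of the original answer; the helper name `cross_nd` is hypothetical) that computes $star(v_1 wedge ldots wedge v_{n-1})$ by cofactor expansion of the formal determinant and checks the orthogonality claim:

```python
import numpy as np

def cross_nd(*vectors):
    """Hodge star of v_1 ^ ... ^ v_{n-1} in R^n, computed by cofactor
    expansion of the formal determinant (hypothetical helper)."""
    vs = np.array(vectors, dtype=float)      # shape (n-1, n): rows are vectors
    n = vs.shape[1]
    assert vs.shape == (n - 1, n), "need exactly n-1 vectors from R^n"
    p = np.empty(n)
    for i in range(n):
        minor = np.delete(vs, i, axis=1)     # drop the i-th component of each vector
        p[i] = (-1) ** i * np.linalg.det(minor)
    return p

# n = 3 recovers the ordinary cross product:
assert np.allclose(cross_nd([1.0, 0, 0], [0, 1.0, 0]), [0, 0, 1])

# n = 4: three vectors in, one vector out, orthogonal to all of them.
v1, v2, v3 = [1.0, 2, 0, 1], [0, 1.0, 1, 0], [2.0, 0, 0, 1]
p = cross_nd(v1, v2, v3)
for v in (v1, v2, v3):
    assert abs(np.dot(v, p)) < 1e-10
```

The sign $(-1)^i$ is exactly the cofactor sign that appears when expanding the formal determinant along its first column.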
So concretely, how do we actually know what the Hodge star of a $k$-blade is? For example, work in $4$-space with the standard orientation. Suppose we want to know $star(v_1 wedge v_3).$ If I understand correctly, it's either $v_2 wedge v_4$ or else $v_4 wedge v_2$. How do we know which one?
– goblin
Jul 25 '17 at 13:32
It depends on the orientation that you choose for your vector space. Let's say $v_1,v_2,v_3,v_4$ form an oriented basis for $V$ (that is, $omega = v_1wedge v_2wedge v_3wedge v_4$); then $star(v_1wedge v_3)=v_4wedge v_2$. This can be seen by using the defining relation for $alpha=v_iwedge v_j$ and cycling through all possible combinations $(i,j)$. This is what they say on the wiki-page linked above in the section "Computation of the Hodge star", albeit expressed in a slightly complicated way, in my opinion.
– Sven Pistre
Jul 25 '17 at 14:25
Of all combinations $(i,j)$ only $(2,4)$ and $(4,2)$ remain (because otherwise the left hand side equals zero). Then you assume $star(v_1wedge v_3)=v_kwedge v_l$ and think about which combinations for $(k,l)$ remain on the right hand side of the def. relation. Then you will see that the only possible one is $(k,l)=(4,2)$. To see the last part, look at the definition of the induced scalar product on $bigwedge^2(V)$.
– Sven Pistre
Jul 25 '17 at 14:28
And also, as I forgot to mention this, $v_2wedge v_4=-v_4wedge v_2$. Changing the position of two vectors in a $k$-wedge just changes the sign. So really it only depends on the chosen orientation (or "right-handedness") of your vector space.
– Sven Pistre
Jul 25 '17 at 14:49
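The sign bookkeeping described in these comments can be automated: for an oriented orthonormal basis, $star(e_{i_1}wedge ldots wedge e_{i_k})$ is the wedge of the complementary basis vectors times the sign of the permutation $(i_1,ldots,i_k,j_1,ldots,j_{n-k})$. A small sketch (the helper names are my own, not from the thread):

```python
def perm_sign(perm):
    """Sign of a permutation of 0..n-1, via inversion count."""
    inv = sum(1 for a in range(len(perm)) for b in range(a + 1, len(perm))
              if perm[a] > perm[b])
    return -1 if inv % 2 else 1

def hodge_star_basis(indices, n):
    """star(e_{i_1} ^ ... ^ e_{i_k}) for an oriented orthonormal basis of R^n
    (0-based indices). Returns (sign, complementary indices): the result is
    sign * e_{j_1} ^ ... ^ e_{j_{n-k}}."""
    comp = tuple(j for j in range(n) if j not in indices)
    return perm_sign(tuple(indices) + comp), comp

# The case from the comment, written 0-based: v1 ^ v3 has indices (0, 2).
sign, comp = hodge_star_basis((0, 2), 4)
assert (sign, comp) == (-1, (1, 3))   # star(v1 ^ v3) = -(v2 ^ v4) = v4 ^ v2
```

In $mathbb{R}^3$ the same helper gives `hodge_star_basis((0, 1), 3) == (1, (2,))`, i.e. $star(e_1wedge e_2)=e_3$, matching the usual cross product.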
@étale-cohomology The Hodge star depends on a choice of inner product and orientation, so it is not canonical. I don't think that you can identify them canonically.
– Sven Pistre
Jul 25 '17 at 17:42
answered Jul 25 '17 at 10:57 by Sven Pistre; edited Jul 27 '17 at 18:51 by José Carlos Santos
You can work out the cross product $p$ in $n$ dimensions using the following:
$$p=detleft(begin{array}{lllll}e_1&x_1&y_1&cdots&z_1\e_2&x_2&y_2&cdots&z_2\vdots&vdots&vdots&ddots&vdots\e_n&x_n&y_n&cdots&z_nend{array}right),$$
where $det$ is the formal determinant of the matrix, the $e_i$ are the basis vectors (e.g. $hat{i},hat{j},hat{k}$, etc.), and $x,y,ldots,z$ are the $n-1$ vectors you wish to "cross".
You will find that $xcdot p=ycdot p=cdots=zcdot p=0$.
It's wonderful that the determinant produces a vector with this property.
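The formal determinant can be expanded symbolically and the orthogonality verified directly. Here is a sketch in $mathbb{R}^4$ (my own illustration; the example vectors are arbitrary), treating the basis vectors as formal symbols:

```python
import sympy as sp

# Formal basis symbols e1..e4 occupying the first column of the matrix.
e = sp.symbols('e1:5')
x, y, z = [1, 2, 0, 1], [0, 1, 1, 0], [2, 0, 0, 1]
M = sp.Matrix([[e[i], x[i], y[i], z[i]] for i in range(4)])
det = sp.expand(M.det())
p = [det.coeff(ei) for ei in e]   # read off the components of p
# p = [2, 1, -1, -4]

# Replacing the e-column by x, y, or z repeats a column, so each dot
# product vanishes -- exactly the orthogonality claimed above.
for v in (x, y, z):
    assert sum(a * b for a, b in zip(v, p)) == 0
```

Because substituting any input vector for the symbolic first column duplicates a column of the matrix, $xcdot p$ (and likewise for $y$ and $z$) is a determinant with a repeated column and hence zero, which is why the check always passes.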
Are there any requirements on the basis vectors $e_1, ..., e_n$? Like, do they need to form an orthonormal basis, or something?
– étale-cohomology
Jul 25 '17 at 17:01
@étale-cohomology they are the base vectors. By definition they are all orthonormal.
– The Great Duck
Jul 25 '17 at 20:31
Yeah but do they have to be? Can't we take any other basis?
– étale-cohomology
Jul 25 '17 at 20:35
By changing the basis to $widetilde e_i$ you will have to change the vector entries to the coefficients $widetilde x_i$ in the basis expansion for the new basis. Remember that the above is only a formal determinant as this is not actually a matrix (since the first column consists of entries that are vectors themselves). So it does not matter if the basis is orthonormal or not but you will have to adjust your formal determinant formula.
– Sven Pistre
Jul 25 '17 at 21:48
(+1) This is the logical extension of José Carlos Santos' answer to $mathbb{R}^n$ (at first, this is what I thought he had given, but now I see his only covers $mathbb{R}^4$).
– robjohn♦
Jul 25 '17 at 22:53
answered Jul 25 '17 at 10:14 by Antinous; edited Jul 26 '17 at 22:49
Yes; apart from the other answers, an interesting approach is to think about it using Clifford algebra.
This introduces the basic concepts in a nonrigorous but approachable manner:
https://slehar.wordpress.com/2014/03/18/clifford-algebra-a-visual-introduction/
Thank you for your answer; however, that article is extremely long and it's difficult to find a Clifford-style answer to my question by reading through it. Can I ask you to write up some details on how to compute an actual cross product using the Clifford approach?
– goblin
Jul 26 '17 at 8:36
answered Jul 25 '17 at 17:50 by Can Özkan
1
Thank you for your answer, however that article is extremely long and it's difficult to find a Clifford-style answer to my question by reading through it. Can I ask you to write up some details on how to compute an actual cross product using Clifford approach's?
– goblin
Jul 26 '17 at 8:36
1
en.wikipedia.org/wiki/Seven-dimensional_cross_product
– user57159
Jul 26 '17 at 3:18