How many elements of a 3x3 Rotation matrix are redundant?
I read the following and got curious:
A rotation matrix is an array of nine numbers. These are subject to
the six norm and orthogonality constraints, so only three degrees of
freedom are left: if three of the numbers are given, the other six can
be computed from these equations.
I know that the matrix only has three degrees of freedom, but the last statement (emphasis mine) is obviously not literally true. Given these seven elements:
$$ \left[ \begin{array}{rrr}
\cdot & 0 & 0 \\
0 & \cdot & 0 \\
0 & 0 & 1 \end{array} \right] $$
There are at least two possible solutions (1, 1 or -1, -1), which give me valid but different rotation matrices (the identity, or a rotation of 180° around the Z axis). So, if I can't choose which elements are revealed to me, I apparently need to know eight elements to be certain about the last.
Is there any way to rephrase the quote above so that it becomes true (are diagonal matrices an exception)? I know, for instance, that if I have the following six elements:
$$ \left[ \begin{array}{rrr}
a & b & c \\
d & e & f \\
\cdot & \cdot & \cdot \end{array} \right] $$
I can deduce the last row by using the cross product of the first two (and choose a sign based on handedness).
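A minimal sketch of that reconstruction, assuming NumPy and made-up orthonormal values for the first two rows:

    import numpy as np

    # Hypothetical first two rows; any orthonormal pair would do here.
    row1 = np.array([1.0, 0.0, 0.0])
    row2 = np.array([0.0, 1.0, 0.0])

    row3 = np.cross(row1, row2)        # orthogonal to both given rows
    R = np.vstack([row1, row2, row3])  # det(R) = +1, a proper (right-handed) rotation
    # Using -row3 instead would give det(R) = -1, i.e. a reflection.
    print(R, np.linalg.det(R))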
Can I get away with fewer than these six elements?
matrices rotations
I think you are reading the statement backwards. With 3 elements known, the remaining 6 can be computed from $R^\intercal R = \boldsymbol{1}$. So given three elements like $\begin{pmatrix} a & b & c \\ \cdot & \cdot & \cdot \\ \cdot & \cdot & \cdot \end{pmatrix}$ the rest can be computed.
– ja72
Aug 1 at 17:53
I don't think you can say that "the rest can be computed". In your example you have given me one axis out of three. In the second row I can select any vector orthogonal to the first and the third row can be computed. I will end up with different rotations depending on which choice I make.
– bgp2000
Aug 1 at 18:48
I didn't say uniquely computed. You can always find two axes that are orthogonal to $(a,b,c)$ and to each other.
– ja72
Aug 1 at 19:44
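A small NumPy sketch of the point made in these comments: given only one row $(a, b, c)$, the remaining two rows can be any orthonormal pair perpendicular to it, so there is a whole one-parameter family of completions (the function and values below are illustrative, not from the discussion):

    import numpy as np

    def complete_rotation(first_row, angle):
        """Return a rotation matrix whose first row is first_row (assumed unit length).
        Different values of angle give different, equally valid completions."""
        u = np.asarray(first_row, dtype=float)
        # Pick any helper vector not parallel to u, then build an orthonormal pair.
        helper = np.array([0.0, 0.0, 1.0]) if abs(u[2]) < 0.9 else np.array([1.0, 0.0, 0.0])
        v = np.cross(u, helper)
        v /= np.linalg.norm(v)
        w = np.cross(u, v)
        # Rotating the (v, w) pair about u sweeps out the whole family of solutions.
        row2 = np.cos(angle) * v + np.sin(angle) * w
        row3 = np.cross(u, row2)   # right-handed completion, det = +1
        return np.vstack([u, row2, row3])

    R1 = complete_rotation([1.0, 0.0, 0.0], 0.0)
    R2 = complete_rotation([1.0, 0.0, 0.0], 0.7)   # same known row, different rotation
    print(np.allclose(R1.T @ R1, np.eye(3)), np.allclose(R2.T @ R2, np.eye(3)))  # True True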
3 Answers
Well, the best way to think about it is probably the following: the group of rotations $SO(n)$ is a manifold of dimension $\frac{n(n-1)}{2}$ for every positive integer $n$. This means you can construct local diffeomorphisms from $SO(n)$ to $\mathbb{R}^{n(n-1)/2}$, which are in particular bijective maps. For $n=3$, we have $\frac{n(n-1)}{2} = 3$, which means that (locally on $SO(3)$!) every rotation matrix is determined uniquely by 3 real numbers. This is what is meant by 3 degrees of freedom. Does this mean you can compute a unique rotation matrix given 3 real numbers? No, it doesn't, but if you have three real numbers and a diffeomorphism as above, you can. However, this matrix will only be unique on the domain of the diffeomorphism.
See also https://en.wikipedia.org/wiki/Charts_on_SO(3) for details.
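One concrete chart of this kind is the axis-angle (rotation-vector) parametrization; a minimal NumPy sketch of Rodrigues' formula, mapping three real numbers to a rotation matrix (unique for rotation angles below $\pi$):

    import numpy as np

    def rotation_from_vector(r):
        """Map a rotation vector r (axis times angle, three numbers) to a 3x3 rotation matrix."""
        r = np.asarray(r, dtype=float)
        theta = np.linalg.norm(r)
        if theta < 1e-12:
            return np.eye(3)
        k = r / theta                            # unit rotation axis
        K = np.array([[0.0,  -k[2],  k[1]],
                      [k[2],  0.0,  -k[0]],
                      [-k[1], k[0],  0.0]])      # skew-symmetric cross-product matrix
        return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

    R = rotation_from_vector([0.0, 0.0, np.pi / 2])   # 90 degrees about the z axis
    print(np.round(R, 6))   # approximately [[0, -1, 0], [1, 0, 0], [0, 0, 1]]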
Could you explain how you determined there are $\frac{n(n-1)}{2}$ degrees of freedom for $SO(n)$ in general? A while back I attempted that derivation and got the result shown here. Maybe I was wrong? Our equations agree for the $n=2$ and $n=3$ cases.
– jnez71
Aug 1 at 15:35
The easiest way is to use the regular value theorem for the defining equations, i.e. $A^t A = 1$ and $\det A = 1$, see e.g. planetmath.org/dimensionofthespecialorthogonalgroup
– Distracted Kerl
Aug 1 at 21:49
I just noticed that our answers are equivalent (whoops, so obvious and yet we both missed it). $$n^2 - \frac{n(n-1)}{2} - n \equiv \frac{n(n-1)}{2}$$ I'd appreciate it if you undid your downvote on my other answer since it is in agreement with you.
– jnez71
Aug 1 at 23:19
The degrees of freedom (DOF) refer to continuous variables. For example, the rotations in two dimensions are determined by $\cos(\theta)^2 + \sin(\theta)^2 = 1$, which has one DOF. However, if given $\cos(\theta)$, then all we can tell is that the other parameter is $\sin(\theta)$ or $-\sin(\theta)$. In general, if $p(x)$ is a polynomial of degree $n$, then $p(x) = 0$ has no DOF and still has $n$ roots counted up to multiplicity.
The answer to your last question about using fewer than six elements is yes, with the proviso, as in the 2D rotation case, that the remaining values are not uniquely determined. Each row and column of a 3D rotation matrix has norm one, which leaves two DOF per row. Also, any two rows or two columns are orthogonal, and each such constraint reduces the DOF by one. The end result is, as you wrote, that there are three DOF in a rotation matrix.
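The same count, written out using the six constraints mentioned in the quoted passage:

$$ \underbrace{9}_{\text{matrix entries}} \;-\; \underbrace{3}_{\text{unit-norm rows}} \;-\; \underbrace{3}_{\text{pairwise orthogonality}} \;=\; 3 \text{ DOF} $$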
The statement is literally true in the sense that a set of 6 elements can be computed when only three are known. It might not be a unique solution, but it is a solution.
Here is an example. I picked the 3 known elements of the following matrix completely at random:
$$ \mathrm{R} = \left[ \begin{matrix} a & 0.87 & b \\ c & d & -0.05 \\ 0.14 & f & g \end{matrix} \right] $$
The remaining unknown elements $a$, $b$, $c$, $d$, $f$ and $g$ are calculated from
$$ \mathrm{R}^\intercal \mathrm{R} = \mathbf{1} $$ or in expanded form $$ \left[ \begin{matrix}
a^2 + c^2 + 0.0196 & 0.87a + cd + 0.14f & ab - 0.05c + 0.14g \\ \cdots & d^2 + f^2 + 0.7569 & 0.87b - 0.05d + fg \\ \cdots & \cdots & b^2 + g^2 + 0.0025
\end{matrix} \right] = \left[ \begin{matrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{matrix} \right] $$
One of the solutions is
$$ \mathrm{R} = \left[ \begin{matrix} -0.447500684015486 & 0.87 & 0.206985839625979 \\
0.883257118740444 & 0.466215467562293 & -0.05 \\
0.14 & -0.160446682127340 & 0.977065433936913 \end{matrix} \right] $$
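A minimal numerical sketch of this calculation, assuming NumPy and SciPy (the solver and the starting point are arbitrary choices, so it may land on a different valid solution than the one above):

    import numpy as np
    from scipy.optimize import fsolve

    def residual(x):
        a, b, c, d, f, g = x
        R = np.array([[a,    0.87,  b   ],
                      [c,    d,    -0.05],
                      [0.14, f,     g   ]])
        M = R.T @ R - np.eye(3)
        # Six independent equations: the upper triangle of the symmetric matrix M.
        return M[np.triu_indices(3)]

    x0 = np.array([1.0, 0.0, 0.0, 1.0, 0.0, 1.0])   # arbitrary starting guess
    a, b, c, d, f, g = fsolve(residual, x0)
    R = np.array([[a, 0.87, b], [c, d, -0.05], [0.14, f, g]])
    print(np.round(R, 6))
    print("det =", np.linalg.det(R))   # +1 for a proper rotation, -1 for a reflection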
That is indeed true. It would be interesting to know what the resulting rotations have in common, if anything. On a (pedantic) side note: your example isn't quite valid since the determinant of R is -1; it should be positive.
– bgp2000
Aug 1 at 18:41
For some of the variables, I chose a root arbitrarily, as in $+\sqrt{\cdot}$ vs. $-\sqrt{\cdot}$. I must have created a left-handed coordinate system instead.
– ja72
Aug 1 at 19:43
@bgp2000 all rotation matrices representing the same rotation have in common the axis of rotation and the angle up to $2\pi n$, where $n$ is some natural number.
– Mauricio Cele Lopez Belon
Aug 1 at 22:33
@MauricioCeleLopezBelon Yes, but the solutions found in this answer don't represent the same rotation.
– bgp2000
Aug 1 at 22:36