Summary: The one that took four days. A working detector finally arrived for the experiment, so work on EM II has been somewhat slower. Also, the example here draws on a lot of material from prior examples and requires being on your toes. This example is all about showing that a rather abstruse-looking rotation matrix is in fact a rotation matrix. It involves recognizing dot and cross products when they're written in tensor index notation, and having rock-solid index skills. At the end of the day it's pretty cool, but it still seems like there should be an even simpler way to do this than the one shown here.

The game is to show that the following is a rotation matrix in that when multiplied by its transpose, the result is the identity matrix:

$M_{ij} = \delta_{ij}\cos\alpha + n_i n_j \left(1 - \cos\alpha\right) + \epsilon_{ijk}n_k \sin\alpha$

Keep in mind that $n_i$ is defined to be a unit vector. The transpose relation that we're supposed to show can be written down as $M_{ij}M_{ik} = \delta_{jk}$.
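Before diving into the index work, the claim can be spot-checked numerically. This is just a sanity sketch with NumPy; the axis and angle below are arbitrary choices, not part of the original problem.

```python
import numpy as np

# Levi-Civita symbol as a 3x3x3 array
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0

def rotation_matrix(n, alpha):
    """M_ij = delta_ij cos(a) + n_i n_j (1 - cos(a)) + eps_ijk n_k sin(a)."""
    return (np.eye(3) * np.cos(alpha)
            + np.outer(n, n) * (1.0 - np.cos(alpha))
            + np.einsum('ijk,k->ij', eps, n) * np.sin(alpha))

n = np.array([1.0, 2.0, 2.0]) / 3.0   # an arbitrary unit vector
M = rotation_matrix(n, 0.7)           # an arbitrary angle

# M_ij M_ik = delta_jk, i.e. the matrix times its transpose is the identity
print(np.allclose(np.einsum('ij,ik->jk', M, M), np.eye(3)))  # prints True
```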

Multiplying the matrix by itself will result in 9 terms. We can make two of them go away identically and two others subtract from each other to disappear.

We'll make one change to the indices just for clarity's sake: rename $M_{ik}$ to $M_{il}$, and when referring to the dummy index $k$ in the $M_{il}$ factor, it will be called $m$.

The nine terms are:

1

$\delta_{ij}\delta_{il}\cos^2\alpha = \delta_{jl}\cos^2\alpha$

For this term, keep in mind that the sum over $i$ is really adding three 3×3 matrices. Only the $j = l$ entries survive, but they survive in three different locations in index space: $(1, 1)$, $(2, 2)$, and $(3, 3)$.
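The contraction of the two deltas can be checked mechanically by representing each as an identity matrix and summing over $i$ (a sketch):

```python
import numpy as np

delta = np.eye(3)  # Kronecker delta as a matrix

# delta_ij delta_il summed over i leaves delta_jl
result = np.einsum('ij,il->jl', delta, delta)
print(np.allclose(result, delta))  # prints True
```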

2

$\delta_{il}n_in_j \cos\alpha\left(1-\cos\alpha\right)$

$= n_ln_j \cos\alpha\left(1 - \cos\alpha\right)$

The trick here was to absorb the $\delta_{il}$ into the $n_i$, changing the index of $n_i$ to $n_l$ in the process.
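The index-renaming trick shows up numerically too: contracting a vector against the Kronecker delta returns the same vector with a relabeled index. A small sketch, using an arbitrary unit vector for illustration:

```python
import numpy as np

delta = np.eye(3)
n = np.array([1.0, 2.0, 2.0]) / 3.0   # arbitrary unit vector

# delta_il n_i = n_l: the delta just renames the index on n
print(np.allclose(np.einsum('il,i->l', delta, n), n))  # prints True
```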

3

$\delta_{il}\epsilon_{ijk}n_k \sin\alpha\cos\alpha$

$= \epsilon_{ljk}n_k \sin\alpha\cos\alpha$

The trick in this first step was to apply the $\delta$ to the $\epsilon$ and switch the appropriate index name as above. There's one more trick that can be played and we'll see it when we get to the 7th term. For now, suffice it to say that this term is going to go away in the end.

4

$\delta_{ij} n_i n_l \cos\alpha\left(1 - \cos\alpha\right)$

First, keep in mind that we're still multiplying the $il$ terms by the $ij$ terms; they're just written out of order above. We can play the same game we played in term number two to get a final answer of

$n_j n_l \cos\alpha \left(1 - \cos\alpha\right)$

These two terms, 2 and 4, are the same, so we just wind up with twice term 2.

5

$n_in_ln_in_j\left(1 - \cos\alpha\right)^2$

$= n_l n_j \left(1 - \cos\alpha\right)^2$

The trick here is to group the two $n_i$ factors next to each other and recognize that group as the dot product of the unit vector with itself. Since it's a unit vector, the dot product evaluates to one.

6

$n_i n_l \epsilon_{ijk} n_k \sin\alpha \left(1 - \cos\alpha\right)$

$= 0$

The trick here is to spot the cross product of two parallel vectors. Since the unit vector is of course identical to itself, it is also parallel to itself, so the cross product, $\epsilon_{ijk}n_i n_k$, disappears.
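The vanishing of the self cross product can be confirmed numerically; any vector works here, unit or not (a sketch):

```python
import numpy as np

eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0

v = np.array([0.3, -1.2, 0.8])   # any vector, chosen arbitrarily

# eps_ijk v_i v_k is the cross product of v with itself;
# the antisymmetry of eps forces it to zero
print(np.allclose(np.einsum('ijk,i,k->j', eps, v, v), 0.0))  # prints True
```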

7

This is the one that winds up being identical to term three.

$\delta_{ij} \epsilon_{ilm} n_m \sin\alpha\cos\alpha$

$= \epsilon_{jlm} n_m \sin\alpha\cos\alpha$

Here's the trick. The k from term 3, rewritten here for convenience:

$\epsilon_{ljk}n_k \sin\alpha\cos\alpha$,

and the $m$ from term 7 are summation variables. Also note that while you're free to choose any values for $l$ and $j$ you like, they have to be the same values in terms 3 and 7. So, when I run my sum through either $k$ or $m$, choosing a value for, let's say, $k$ fixes the values of $j$ and $l$ (they have to differ from the chosen $k$ or $m$ to avoid making the Levi-Civita symbol zero). Since $j$ and $l$ are now fixed, there's only one value of $m$ that will be non-zero. Essentially, this makes $k$ and $m$ move in lockstep with each other.

Now, for the anti-symmetry trick. If I switch any two indices in a Levi-Civita symbol, I change its sign. If you check, you'll see that $j$ and $l$ are reversed in order between terms 3 and 7. This means that the two terms will always have opposite signs, but the same magnitudes, and will sum to 0. Terms 3 and 7 are removed from the rest of the process.
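The cancellation of terms 3 and 7 can also be seen numerically. With the common factor of $\sin\alpha\cos\alpha$ stripped off, the two $\epsilon$ contractions are exact negatives of each other (a sketch, with an arbitrary unit axis):

```python
import numpy as np

eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0

n = np.array([1.0, 2.0, 2.0]) / 3.0   # arbitrary unit vector

term3 = np.einsum('ljk,k->lj', eps, n)   # eps_ljk n_k, indexed [l, j]
term7 = np.einsum('jlm,m->lj', eps, n)   # eps_jlm n_m, also indexed [l, j]

# swapping j and l flips the sign of eps, so the two terms cancel
print(np.allclose(term3 + term7, 0.0))  # prints True
```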

8

Term eight is similar to term six but with different indices. It is equal to 0 for the same reason having to do with the cross product of parallel vectors.

9

$\epsilon_{ijk} n_k \epsilon_{ilm} n_m \sin^2\alpha$

First, we need to rearrange the indices to make use of the identity that transforms a product of $\epsilon$s into a difference of products of $\delta$s.

$\epsilon_{ijk} n_k \epsilon_{ilm} n_m \sin^2\alpha = \epsilon_{jki} n_k \epsilon_{lmi} n_m \sin^2\alpha$

The operation above cycled the indices on the $\epsilon$s to make them look like the identity as recorded in the notes. Cycling the indices of a Levi-Civita symbol is an even permutation and introduces no sign change. Once this is done, we can use the identity,

$\epsilon_{jki} \epsilon_{lmi} = \delta_{jl} \delta_{km} - \delta_{jm} \delta_{kl}$,

to turn term 9 into

$\left(\delta_{jl} \delta_{km}n_k n_m - \delta_{jm} \delta_{kl}n_k n_m\right) \sin^2\alpha$

I'll refer to the first subterm of 9 as 9a and to the second subterm as 9b. It's easy to get carried away and try to think or write out all the different combinations of indices in this one. The term can be evaluated much more quickly by turning off all original thinking and just using pre-existing tricks though. First, the 9a term contains a dot product, so,

$\delta_{jl} \delta_{km}n_k n_m \sin^2\alpha = \delta_{jl} \sin^2\alpha$

Second, 9b can be simplified mechanically by running the same, '$\delta_{ij}$ renames an index of what it's applied to' trick we've been running throughout. Here's the result,

$\delta_{jm} \delta_{kl}n_k n_m \sin^2\alpha = n_l n_j \sin^2\alpha$

The final result is

$\delta_{jl} \sin^2\alpha - n_l n_j \sin^2\alpha$
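Both the $\epsilon$–$\delta$ identity and the final form of term 9 (with the common $\sin^2\alpha$ factor dropped) can be checked numerically. A sketch, again with an arbitrary unit axis:

```python
import numpy as np

eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0
delta = np.eye(3)
n = np.array([1.0, 2.0, 2.0]) / 3.0   # arbitrary unit vector

# the identity: eps_jki eps_lmi = delta_jl delta_km - delta_jm delta_kl
lhs = np.einsum('jki,lmi->jklm', eps, eps)
rhs = (np.einsum('jl,km->jklm', delta, delta)
       - np.einsum('jm,kl->jklm', delta, delta))
print(np.allclose(lhs, rhs))  # prints True

# term 9 without its sin^2 factor: eps_ijk n_k eps_ilm n_m = delta_jl - n_j n_l
term9 = np.einsum('ijk,k,ilm,m->jl', eps, n, eps, n)
print(np.allclose(term9, delta - np.outer(n, n)))  # prints True
```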

**Summing the terms**

The remaining five terms (or six, depending on how you count) are summed to give:

$\delta_{jl} \cos^2\alpha + \delta_{jl} \sin^2\alpha + n_j n_l \cos\alpha \left(1 - \cos\alpha \right) + n_l n_j \left(1 - \cos\alpha \right)^2 + n_l n_j \cos\alpha \left(1 - \cos\alpha \right) - n_l n_j \sin^2\alpha$

The first two terms add to $\delta_{jl}$, and the third and fifth terms just double, to give

$\delta_{jl} + 2 n_j n_l \cos\alpha \left(1 - \cos\alpha \right) + n_l n_j \left(1 - \cos\alpha \right)^2 - n_l n_j \sin^2\alpha$

This leaves a few cosine terms to expand and evaluate. The doubled term and the squared term expand, respectively, to

$2 \cos\alpha - 2 \cos^2\alpha$

$-2 \cos\alpha + \cos^2\alpha + 1$

Summing these leaves us with $-\cos^2\alpha + 1 = \sin^2\alpha$

Plugging this back into the original sum of terms we get,

$\delta_{jl} + n_l n_j \sin^2\alpha - n_l n_j \sin^2\alpha = \delta_{jl}$
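As a final cross-check, the five surviving terms can be assembled numerically and compared against $\delta_{jl}$ (a sketch, with an arbitrary unit axis and angle):

```python
import numpy as np

n = np.array([1.0, 2.0, 2.0]) / 3.0   # arbitrary unit vector
a = 0.7                               # arbitrary angle
c, s = np.cos(a), np.sin(a)
nn = np.outer(n, n)

total = (np.eye(3) * c**2                 # term 1
         + nn * c * (1 - c)               # term 2
         + nn * c * (1 - c)               # term 4
         + nn * (1 - c)**2                # term 5
         + np.eye(3) * s**2 - nn * s**2)  # term 9

print(np.allclose(total, np.eye(3)))  # prints True
```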

It's done!

**Picture of the Day**: Sunset over Sound Beach, NY.
