Tuesday, June 5, 2018

tensors - Proving vector identity by Einstein summation



Given that $$N_{ij}=\delta_{ij}-\epsilon_{ijk}n_k+n_in_j$$
and $$M_{ij}=\delta_{ij}+\epsilon_{ijk}n_k$$

Show that $$N_{ij}M_{jk}=2\delta_{ik}$$ (where $n$ is a unit vector in $\Bbb R^3$)



What I have tried:
$$M_{jk}=\delta_{jk}+\epsilon_{jkk}n_k=\delta_{jk}$$
So $$N_{ij}M_{jk}=(\delta_{ij}-\epsilon_{ijk}n_k+n_in_j)\delta_{jk}=\delta_{ik}-0+n_in_k$$



I'm not entirely sure what I did is legal under summation rules, but the terms I ended up with don't match the provided answer. Any hint is appreciated.



And what are the usual strategies when dealing with proving vector identities using suffix notation?


Answer




As far as computations go, spaceisdarkgreen's answer above handles them well. But I have some qualms with this exercise in general. Your mistake is one of the reasons why I prefer to write vector components with upper indices, with Einstein's convention stated as: if the same index appears twice, once up and once down, sum over it. If $n$ is a unit vector in $\Bbb R^3$, we'd write its components as $n^1$, $n^2$ and $n^3$. This calls for the index balance $$N^{ij} = \delta^{ij} - \epsilon_{\;\;k}^{ij}n^k + n^in^j \qquad \mbox{and} \qquad M^{ij} = \delta^{ij} + \epsilon_{\;\;k}^{ij}n^k.$$Then, as written, the expression $N^{ij}M^{jk}$ makes no sense, since the index $j$ appears twice in the upper position.



As it is written, $N^{ij}$ and $M^{ij}$ are components of tensors $M,N\colon (\Bbb R^3)^* \times (\Bbb R^3)^* \to \Bbb R$, in the standard basis of $\Bbb R^3$ (and its dual basis).



In order to make sense of the expression $N^{ij}M^{jk}$, we should work with the tensor $\widetilde{M}\colon \Bbb R^3 \times (\Bbb R^3)^* \to \Bbb R$, which is equivalent to $M$ under index lowering. Since (if?) these components are taken with respect to an orthonormal basis, and the scalar product is positive-definite, $\widetilde{M}$'s components are just $$M_i^{\;j} = \delta_i^{\;j} + \epsilon_{i\;k}^{\;j}n^k,$$and so the expression $N^{ij}M_{j}^{\;k}$ makes sense. Einstein's convention thus comes with a built-in error detector. Notice that $i$ and $k$ are fixed indices, so they're off limits when we want to relabel dummy indices. We then have: $$\begin{align}N^{ij}M_{j}^{\;k} &= (\delta^{ij} - \epsilon_{\;\;r}^{ij}n^r + n^in^j)(\delta_j^{\;k} + \epsilon_{j\;s}^{\;k}n^s) \\ &= \delta^{ij}\delta_j^{\;k} + \delta^{ij}\epsilon_{j\;s}^{\;k}n^s - \epsilon^{ij}_{\;\;r}n^r\delta_j^{\;k} - \epsilon^{ij}_{\;\;r}n^r\epsilon_{j\;s}^{\;k}n^s + n^in^j\delta_j^{\;k} + \epsilon_{j\;s}^{\;k}n^in^jn^s \\ &= \delta^{ik} + \delta^{ij}\epsilon_{j\;s}^{\;k}n^s - \epsilon^{ik}_{\;\;r}n^r - \color{blue}{(\epsilon^{ij}_{\;\;r}\epsilon_{j\;s}^{\;k})}n^rn^s + n^in^k + \color{red}{(\epsilon_{j\;s}^{\;k}n^jn^s)}n^i .\end{align}$$
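Since the basis here is orthonormal, all indices can be lowered and the contraction $N^{ij}M_j^{\;k}$ becomes an ordinary matrix product, so the six-term expansion above can be sanity-checked numerically. A minimal sketch with NumPy's `einsum` (the array `eps` encoding the Levi-Civita symbol and the random unit vector `n` are my own test scaffolding, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(0)

# Levi-Civita symbol as a (3, 3, 3) array
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k] = 1.0   # even permutations
    eps[k, j, i] = -1.0  # odd permutations

# arbitrary unit vector n
n = rng.normal(size=3)
n /= np.linalg.norm(n)

# N_{ik} = delta - eps.n + n n^T,  M_{ik} = delta + eps.n
N = np.eye(3) - np.einsum('ijk,k->ij', eps, n) + np.outer(n, n)
M = np.eye(3) + np.einsum('ijk,k->ij', eps, n)

# the six terms of the expansion, all indices lowered
expansion = (np.eye(3)
             + np.einsum('iks,s->ik', eps, n)                 # delta * eps
             - np.einsum('ikr,r->ik', eps, n)                 # eps * delta
             - np.einsum('ijr,jks,r,s->ik', eps, eps, n, n)   # blue piece
             + np.outer(n, n)
             + np.einsum('i,jks,j,s->ik', n, eps, n, n))      # red piece

assert np.allclose(N @ M, expansion)
```

The assertion holds because the six terms are exactly the distributed product, before any cancellation.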



To simplify the piece in blue, we use some well-known permutation identities: $$\epsilon^{ij}_{\;\;r}\epsilon_{j\;s}^{\;k} = -\epsilon^{ji}_{\;\;r}\epsilon_{j\;s}^{\;k} = -(\delta^{ik}\delta_{rs} - \delta^i_{\;s}\delta^k_{\;r}) = \delta^i_{\;s}\delta^k_{\;r}-\delta^{ik}\delta_{rs}. $$
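With all indices lowered (legitimate here, since the basis is orthonormal), this contraction identity can be verified numerically as well; a quick sketch, again with a hand-built Levi-Civita array:

```python
import numpy as np

# Levi-Civita symbol as a (3, 3, 3) array
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k] = 1.0
    eps[k, j, i] = -1.0

d = np.eye(3)

# eps_{ijr} eps_{jks}, contracted over j
lhs = np.einsum('ijr,jks->ikrs', eps, eps)

# delta^i_s delta^k_r - delta^{ik} delta_{rs}
rhs = np.einsum('is,kr->ikrs', d, d) - np.einsum('ik,rs->ikrs', d, d)

assert np.allclose(lhs, rhs)
```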



Modulo sign, the piece in red is the $k$-th component of the cross product $n \times n$, which is zero.
$\require{cancel}$

So: $$\begin{align}N^{ij}M_{j}^{\;k} &= \delta^{ik} + \delta^{ij}\epsilon_{j\;s}^{\;k}n^s - \epsilon^{ik}_{\;\;r}n^r - \delta^i_{\;s}\delta^k_{\;r}n^rn^s+\delta^{ik}\delta_{rs}n^rn^s+n^in^k \\ &=\delta^{ik} + \delta^{ij}\epsilon_{j\;s}^{\;k}n^s - \epsilon^{ik}_{\;\;r}n^r - \cancel{n^kn^i}+\delta^{ik}\color{green}{\delta_{rs}n^rn^s}+\cancel{n^in^k} \\ &=2\delta^{ik} + \delta^{ij}\epsilon_{j\;s}^{\;k}n^s - \epsilon^{ik}_{\;\;r}n^r , \end{align}$$since the piece in green is nothing more than $n \cdot n = 1$.



Further simplification would violate Einstein's convention as I have stated it. But since we're in a good situation (as explained earlier), you can lower all indices and use Einstein's convention in the form you were using (sum over any repeated index) to simplify what is left, if you want to.






For readers who know Portuguese: I happen to have written some material on tensors that might be helpful.







I just realized that we actually can simplify further, using that $r$ in the last term is a dummy index. We'll have $$\begin{align}N^{ij}M_{j}^{\;k} &= 2\delta^{ik} + \delta^{ij}\epsilon_{j\;s}^{\;k}n^s - \epsilon^{ik}_{\;\;r}n^r \\ &= 2\delta^{ik} + \cancel{\epsilon_{\;\;s}^{ik}n^s} - \cancel{\epsilon^{ik}_{\;\;s}n^s} \\ &= 2\delta^{ik}.\end{align}$$
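The whole identity can also be confirmed end-to-end with a numerical check: build $N$ and $M$ from any unit vector and verify that the matrix product is $2\delta^{ik}$. A minimal sketch (the specific unit vector below is an arbitrary choice):

```python
import numpy as np

# Levi-Civita symbol as a (3, 3, 3) array
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k] = 1.0
    eps[k, j, i] = -1.0

# any unit vector works; this one is arbitrary
n = np.array([1.0, 2.0, 2.0]) / 3.0

N = np.eye(3) - np.einsum('ijk,k->ij', eps, n) + np.outer(n, n)
M = np.eye(3) + np.einsum('ijk,k->ij', eps, n)

assert np.allclose(N @ M, 2 * np.eye(3))
```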

