
Tensors - Part 1

Posted: Sun Dec 25, 2016 6:17 pm
by Tsakanikas Nickos
Definition 1: Let $V$ be a finite-dimensional $\mathbb{R}$-vector space and let $k$ be a non-negative integer. A covariant $k$-tensor on $V$ is a multilinear function $T \colon V \times \dots \times V \to \mathbb{R}$, where the product contains $k$ copies of $V$.
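For instance, the Euclidean dot product on $\mathbb{R}^{n}$ is a covariant $2$-tensor: writing $X = (X^{1}, \dots, X^{n})$ and $Y = (Y^{1}, \dots, Y^{n})$, the map
\[ \langle X, Y \rangle = \sum_{i=1}^{n} X^{i} Y^{i} \]is linear in each argument separately. Likewise, the determinant, viewed as a function of the $n$ columns of an $n \times n$ matrix, is a covariant $n$-tensor on $\mathbb{R}^{n}$.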

Definition 2: A covariant $k$-tensor is called symmetric if its value is unchanged by interchanging any pair of its arguments:
\[ T(X_{1}, \dots, X_{i}, \dots, X_{j}, \dots, X_{k}) = T(X_{1}, \dots, X_{j}, \dots, X_{i}, \dots, X_{k}) \] whenever $1 \leq i < j \leq k$.
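To fix ideas: the dot product above is symmetric, whereas the covariant $2$-tensor on $\mathbb{R}^{2}$ given by
\[ T(X, Y) = X^{1} Y^{2} \]is not, since $T(E_{1}, E_{2}) = 1$ while $T(E_{2}, E_{1}) = 0$ for the standard basis $E_{1}, E_{2}$.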


Show that the following are equivalent for a covariant $k$-tensor $T$ (a short hint follows the list):
  1. $T$ is symmetric.
  2. For any vectors $X_{1}, \dots, X_{k}$, the value of $T(X_{1}, \dots, X_{k})$ is unchanged when $X_{1}, \dots, X_{k}$ are rearranged in any order.
  3. The components $T_{i_{1} \dots i_{k}} = T(E_{i_{1}}, \dots, E_{i_{k}})$ of $T$ with respect to any basis $ \left\{ E_{i} \right\} $ of $V$ are unchanged by any permutation of the indices.
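A possible line of attack (one sketch among several): for $(1) \Rightarrow (2)$, recall that every permutation $\sigma \in S_{k}$ is a product of transpositions, so symmetry under swapping a single pair of arguments already gives
\[ T(X_{\sigma(1)}, \dots, X_{\sigma(k)}) = T(X_{1}, \dots, X_{k}) \quad \text{for all } \sigma \in S_{k}. \]For $(3) \Rightarrow (1)$, expand each argument in the basis $\left\{ E_{i} \right\}$ and use multilinearity to reduce the statement to one about the components $T_{i_{1} \dots i_{k}}$.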