# Tensor

**Tensors** are mathematical objects consisting of indices and components, which obey rules of transformation.^{[1]} Tensors, and tensor fields, are useful in mechanical engineering, electromagnetic theory, differential geometry, and the general theory of relativity. The study of tensors is variously called "tensor algebra", "tensor calculus", or "tensor analysis".

Briefly, a tensor is some kind of linear function involving vectors. The "rank" of a tensor indicates how many vectors are involved as arguments to that function. A simple number (that is, a "scalar") is a function of no arguments that just returns a number, and is called a "zeroth rank" tensor. (Lest the reader think that scalars are utterly trivial, consider that "scalar fields", "vector fields", and "tensor fields" are assignments of scalars, vectors, and tensors to every point in space, and that the gradient of a scalar field is a first-rank covariant tensor field.)

Tensors may be "covariant", "contravariant", or "mixed", depending on just how they deal with their arguments; one type can usually be converted into another by raising or lowering indices. A vector is a first rank contravariant tensor. A linear function that maps an input vector to an output vector (sometimes called a "linear transformation") is a second rank mixed tensor. (Such a thing is often represented by a matrix, but this glosses over the distinction between the thing itself and the numbers that describe it in a particular basis.)

Tensors are often defined via their transformation properties, that is, by how their components change when one rotates the coordinate axes. Suppose we have a set of numbers $a_i$, and we want to know how their values change under rotation of Cartesian axes. If the values in the new co-ordinate system can be written as:

$$a'_i = R_{ij} a_j$$

where $R_{ij}$ are the elements of the rotation matrix, then the $a_i$ are said to be the components of a first rank contravariant tensor, that is, a vector. Using the Einstein summation convention, the repeated index $j$ is summed over. This process of summing over a repeated index is known as "contraction", or in this case "contracting the $j$ index". The components of a second rank contravariant tensor satisfy

$$T'_{ij} = R_{ik} R_{jl} T_{kl}$$

and for higher order tensors, we just keep adding more rotation matrices, or their inverses for covariant tensors.
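As a concrete sketch of these transformation rules (plain Python with no libraries; the 90° rotation about the z-axis is just an illustrative choice), the sums over repeated indices can be written out explicitly:

```python
import math

# Illustrative rotation matrix: 90 degrees about the z-axis.
theta = math.pi / 2
R = [[math.cos(theta), -math.sin(theta), 0.0],
     [math.sin(theta),  math.cos(theta), 0.0],
     [0.0,              0.0,             1.0]]

def transform_vector(R, a):
    """First-rank rule: a'_i = R_ij a_j (sum over the repeated index j)."""
    return [sum(R[i][j] * a[j] for j in range(3)) for i in range(3)]

def transform_tensor2(R, T):
    """Second-rank rule: T'_ij = R_ik R_jl T_kl (sum over k and l)."""
    return [[sum(R[i][k] * R[j][l] * T[k][l]
                 for k in range(3) for l in range(3))
             for j in range(3)] for i in range(3)]

a = [1.0, 0.0, 0.0]            # unit vector along x
print(transform_vector(R, a))  # rotated onto the y-axis (approximately [0, 1, 0])
```

Note that the nested loops are exactly the contractions the summation convention hides: one summed index for a vector, two for a second rank tensor.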

## Examples

### Kronecker Delta

The Kronecker delta is a second rank (it has two indices) tensor that acts as the identity matrix.^{[2]} It is denoted $\delta_{ij}$ and is equal to 1 if $i=j$ and 0 otherwise. This means for any tensor, say $a_j$, the Kronecker delta can contract with it as follows:

$$\delta_{ij} a_j = a_i$$

as all terms in which $i$ does not equal $j$ are 0. In effect, it replaces one index with another, in this case $j$ with $i$.
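A quick Python sketch (illustrative only) shows the contraction collapsing to the single surviving term:

```python
def delta(i, j):
    """Kronecker delta: 1 if i == j, 0 otherwise."""
    return 1 if i == j else 0

a = [2.0, 3.0, 5.0]
# Contract delta_ij with a_j: only the j == i term survives the sum.
contracted = [sum(delta(i, j) * a[j] for j in range(3)) for i in range(3)]
print(contracted)  # [2.0, 3.0, 5.0] -- the original vector, with index j renamed to i
```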

### Levi-Civita

The Levi-Civita tensor, also known as the permutation tensor, is a third rank tensor used in cross products.^{[3]} It is defined as:

$$\varepsilon_{ijk} = \begin{cases} +1 & \text{if } ijk \text{ is an even permutation of } 123 \\ -1 & \text{if } ijk \text{ is an odd permutation of } 123 \\ 0 & \text{otherwise (a repeated index)} \end{cases}$$
A permutation is even if it contains each of 1, 2, 3 exactly once and can be rearranged into the form 123 by swapping adjacent numbers an even number of times (hence "even" permutation). An example is 231: swapping the 3 and 1 gives 213, and then swapping the 2 and 1 gives 123, so 123 is reached in two swaps, an even number. Similarly, an odd permutation contains each of 1, 2, 3 exactly once and can be rearranged into the form 123 by swapping adjacent numbers an odd number of times. As the sign of the Levi-Civita tensor flips whenever two indices are exchanged, it is sometimes called the "totally antisymmetric tensor".
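The parity test described above can be sketched in Python (illustrative, not from the source) by counting the adjacent swaps a bubble sort uses to restore the order 123:

```python
def parity(p):
    """Return +1 for an even permutation, -1 for an odd one, by counting
    the adjacent swaps a bubble sort needs to put the sequence in order."""
    p = list(p)
    swaps = 0
    for _ in range(len(p)):
        for j in range(len(p) - 1):
            if p[j] > p[j + 1]:
                p[j], p[j + 1] = p[j + 1], p[j]
                swaps += 1
    return 1 if swaps % 2 == 0 else -1

print(parity((2, 3, 1)))  # 1: 231 -> 213 -> 123, two adjacent swaps (even)
print(parity((2, 1, 3)))  # -1: one swap (odd)
```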

The Levi-Civita tensor is particularly useful when dealing with cross products. The cross product can be represented as:^{[2]}

$$(\mathbf{a} \times \mathbf{b})_i = \varepsilon_{ijk} a_j b_k$$
The Levi-Civita tensor can be extended to higher dimensions by adding more indices and maintaining the definition of even and odd permutations above.
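Putting the definition to work, here is a minimal Python sketch (the function names `levi_civita` and `cross` are illustrative, not a standard API; indices run 0–2 rather than 1–3) that builds the cross product from $\varepsilon_{ijk} a_j b_k$:

```python
def levi_civita(i, j, k):
    """epsilon_ijk for indices 0, 1, 2: +1 for even permutations,
    -1 for odd, 0 if any index repeats."""
    if len({i, j, k}) < 3:
        return 0
    # The even permutations of (0, 1, 2) are its cyclic shifts.
    return 1 if (i, j, k) in {(0, 1, 2), (1, 2, 0), (2, 0, 1)} else -1

def cross(a, b):
    """(a x b)_i = epsilon_ijk a_j b_k, summing over the repeated j and k."""
    return [sum(levi_civita(i, j, k) * a[j] * b[k]
                for j in range(3) for k in range(3))
            for i in range(3)]

print(cross([1, 0, 0], [0, 1, 0]))  # [0, 0, 1]: x cross y = z
```

The double sum over $j$ and $k$ has nine terms per component, but at most two are nonzero, which is why the component formulas for the cross product each have exactly two terms.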