Tensors are mathematical objects described by components labelled by indices, the components obeying definite rules of transformation under a change of coordinates. Tensors, and tensor fields, are useful in mechanical engineering, electromagnetic theory, differential geometry, and the general theory of relativity. The study of tensors is variously called "tensor algebra", "tensor calculus", or "tensor analysis".
Briefly, a tensor is some kind of linear function involving vectors. The "rank" of a tensor indicates how many vectors are involved as arguments to that function. A simple number (that is, a "scalar") can be regarded as a function of no arguments, and is a tensor of rank zero. (Lest the reader think that scalars are utterly trivial, consider that "scalar fields", "vector fields", and "tensor fields" are assignments of scalars, vectors and tensors to every point in space, and that the gradient of a scalar field is a first-rank covariant tensor field.)
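The last remark can be made concrete. The sketch below (my own illustration, with a made-up sample field) computes the gradient of a scalar field numerically: at each point the field gives one number, while its gradient gives a rank-one tensor, a vector of partial derivatives.

```python
# A sample scalar field f(x, y): one number at each point of the plane.
def f(x, y):
    return x**2 + 3*x*y

# Its gradient at (x, y): a first-rank tensor (vector) at each point,
# approximated here by central finite differences.
def grad_f(x, y, h=1e-6):
    df_dx = (f(x + h, y) - f(x - h, y)) / (2*h)
    df_dy = (f(x, y + h) - f(x, y - h)) / (2*h)
    return (df_dx, df_dy)

# Analytically, grad f = (2x + 3y, 3x), so at (1, 2) it is (8, 3).
gx, gy = grad_f(1.0, 2.0)
```

Evaluating the gradient at several points traces out the vector field that the scalar field induces.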
Tensors may be "covariant", "contravariant", or "mixed", depending on just how they deal with their arguments. When a metric is available, indices can be raised and lowered, converting one type into another. A vector is a first rank contravariant tensor. A linear function that maps an input vector to an output vector is a second rank mixed tensor. (Such a thing is often represented by a matrix, but this glosses over a distinction between the thing itself and the numbers that describe it in a particular basis.)
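A minimal sketch of this view of a matrix (my own illustration, with arbitrary sample components): the second-rank mixed tensor is the linear map itself, while the nested list of numbers is merely its representation in one chosen basis.

```python
# Apply the linear map with components T[i][j] to the vector v.
# T represents a second-rank mixed tensor: one index contracts with
# the input vector, the other labels the output vector's components.
def apply(T, v):
    return [sum(T[i][j] * v[j] for j in range(len(v)))
            for i in range(len(T))]

# Sample components in some basis (arbitrary numbers for illustration).
T = [[2.0, 0.0],
     [1.0, 3.0]]
v = [1.0, 1.0]
w = apply(T, v)  # the map sends the vector v to the vector w = [2.0, 4.0]
```

In another basis the same map would be described by different numbers, which is exactly the distinction the parenthetical remark draws.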
Tensors are often defined via their transformation properties, that is, by how their components change when one rotates the coordinate axes. Suppose we have a set of numbers $x^i$, and we want to know how their values change under a rotation of the Cartesian axes. If the values $x'^i$ in the new coordinate system can be written

$$x'^i = \sum_j R^i{}_j \, x^j ,$$
where the $R^i{}_j$ are the elements of the rotation matrix, then the $x^i$ are said to be the components of a first rank contravariant tensor, that is, a vector. Similarly, the components $T^{ij}$ of a second rank contravariant tensor satisfy

$$T'^{ij} = \sum_k \sum_l R^i{}_k \, R^j{}_l \, T^{kl} ,$$
and for tensors of higher rank we simply keep adding rotation matrices, one for each contravariant index, or their inverses for covariant indices.
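The two transformation laws above can be checked numerically. The sketch below (my own illustration) rotates a vector with one copy of the rotation matrix, rotates a second-rank tensor built as an outer product with two copies, and confirms that the outer product of the rotated vector agrees with the transformed tensor.

```python
import math

# A 2-D rotation matrix R for an arbitrary sample angle.
theta = 0.3
R = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

x = [1.0, 2.0]

# First rank: x'^i = sum_j R[i][j] x[j]
xp = [sum(R[i][j] * x[j] for j in range(2)) for i in range(2)]

# A second-rank contravariant tensor formed as an outer product,
# T^{ij} = x^i x^j.
T = [[x[i] * x[j] for j in range(2)] for i in range(2)]

# Second rank: T'^{ij} = sum_{k,l} R[i][k] R[j][l] T[k][l]
Tp = [[sum(R[i][k] * R[j][l] * T[k][l]
           for k in range(2) for l in range(2))
       for j in range(2)] for i in range(2)]

# Consistency check: the outer product of the rotated vector must
# equal the tensor transformed with two rotation matrices.
Tp_check = [[xp[i] * xp[j] for j in range(2)] for i in range(2)]
```

For higher ranks the pattern continues: a rank-three contravariant tensor picks up three copies of the rotation matrix, one sum per index.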