Entropy is a measure of disorder or information content in a system. An early precursor of the idea appears in Lazare Carnot's 1803 work on the loss of useful work in machines; the modern thermodynamic concept was introduced by Rudolf Clausius in the 1850s and 1860s.
Entropy is empirically inescapable, yet it raises deep conceptual difficulties for many modern theories of physics. For example, it singles out a direction of time (the thermodynamic arrow of time), which is difficult to reconcile with the time-symmetric laws of mechanics and with relativity, and it has even prompted speculation about whether physical laws and constants such as the speed of light are truly invariant and eternal.
In classical thermodynamics, if a small amount of heat $dQ$ is supplied reversibly to a system from a reservoir held at temperature $T$, the change in entropy is given by

$$ dS = \frac{dQ}{T}. $$
For a measurable change between two states $i$ and $f$ this expression integrates to

$$ \Delta S = S_f - S_i = \int_i^f \frac{dQ}{T}. $$
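As a quick illustration (a sketch using assumed numbers, not taken from the text above): for a body with constant heat capacity $C$ heated reversibly from $T_i$ to $T_f$, $dQ = C\,dT$ and the integral gives $\Delta S = C \ln(T_f/T_i)$. The Python snippet below checks this closed form against a direct numerical sum of $dQ/T$.

```python
import numpy as np

C = 4184.0                   # assumed heat capacity of the body, J/K (roughly 1 kg of water)
T_i, T_f = 293.15, 353.15    # assumed initial and final temperatures, K

# Closed form: integrating dS = dQ/T with dQ = C dT gives Delta S = C ln(T_f / T_i)
dS_exact = C * np.log(T_f / T_i)

# Numerical check: accumulate dQ / T over many small reversible heating steps
T = np.linspace(T_i, T_f, 100_001)
dQ = C * np.diff(T)                  # heat supplied in each small step
dS_numeric = np.sum(dQ / T[:-1])     # sum of dQ / T

print(f"Delta S (closed form): {dS_exact:.3f} J/K")
print(f"Delta S (numerical):   {dS_numeric:.3f} J/K")
```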
Statistical mechanics definition 1 (Boltzmann Entropy)
If a system can be arranged in $W$ different ways, the entropy is

$$ S = k_B \ln W, $$

where $k_B$ is Boltzmann's constant.
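For example (an illustrative sketch with an assumed system, not part of the original definition): $N$ independent two-state particles can be arranged in $W = 2^N$ equally likely ways, so $S = k_B \ln W = N k_B \ln 2$.

```python
import math

k_B = 1.380649e-23    # Boltzmann's constant, J/K
N = 100               # assumed number of two-state particles, for illustration

W = 2 ** N                     # number of ways the system can be arranged
S = k_B * math.log(W)          # Boltzmann entropy S = k_B ln W

print(f"W = 2^{N} microstates -> S = {S:.3e} J/K")
print(f"Check against N k_B ln 2:   {N * k_B * math.log(2):.3e} J/K")
```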
Statistical mechanics definition 2 (Gibbs Entropy)
Label the different states a thermodynamic system can be in by $i = 1, 2, 3, \ldots$. If the probability of finding the system in state $i$ is $p_i$, then the entropy is

$$ S = -k_B \sum_i p_i \ln p_i, $$
where $k_B$ is the Boltzmann constant. This definition is closely related to ideas in information theory, where the definition of information content is very similar to the definition of entropy.
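A minimal sketch (the probability distributions are assumed examples) showing that the Gibbs formula reproduces $S = k_B \ln W$ when all $W$ states are equally likely, and gives a smaller value for an uneven distribution:

```python
import numpy as np

k_B = 1.380649e-23   # Boltzmann's constant, J/K

def gibbs_entropy(p):
    """Gibbs entropy S = -k_B sum_i p_i ln p_i (states with p_i = 0 contribute nothing)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -k_B * np.sum(p * np.log(p))

W = 4
uniform = np.full(W, 1 / W)                # all W states equally likely
biased = np.array([0.7, 0.1, 0.1, 0.1])    # assumed example of an uneven distribution

print(gibbs_entropy(uniform), k_B * np.log(W))  # equal: recovers S = k_B ln W
print(gibbs_entropy(biased))                    # strictly smaller than k_B ln W
```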
Entropy in Information Theory (Shannon Entropy)
For a discrete random variable whose outcomes occur with probabilities $p_i$, the Shannon entropy is

$$ H = -\sum_i p_i \log_2 p_i, $$

measured in bits. It has the same form as the Gibbs entropy, with Boltzmann's constant omitted and the natural logarithm replaced by the base-2 logarithm.
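A minimal sketch (the distributions are assumed examples) computing the Shannon entropy in bits:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum_i p_i log2 p_i, in bits (zero-probability outcomes are skipped)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))    # fair coin: 1 bit
print(shannon_entropy([0.9, 0.1]))    # biased coin: about 0.469 bits
print(shannon_entropy([0.25] * 4))    # four equally likely outcomes: 2 bits
```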
Entropy in Quantum Information Theory (Von Neumann Entropy)
In entangled systems, a useful quantity is the Von Neumann Entropy, defined (for a system with density matrix $\rho$) by

$$ S = -\mathrm{Tr}(\rho \ln \rho), $$
where $\mathrm{Tr}(\cdot)$ indicates taking the trace of a matrix (the sum of its diagonal elements). This is a useful measure of entanglement: it is zero for a pure state and maximal for a fully mixed state.
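As an illustrative sketch (the states are assumed examples): the snippet below evaluates $S = -\mathrm{Tr}(\rho \ln \rho)$ from the eigenvalues of $\rho$, confirming that a pure qubit state gives zero entropy while the fully mixed qubit state gives the maximal value $\ln 2$.

```python
import numpy as np

def von_neumann_entropy(rho):
    """Von Neumann entropy S = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    eigvals = np.linalg.eigvalsh(rho)      # rho is Hermitian, so its eigenvalues are real
    eigvals = eigvals[eigvals > 1e-12]     # the limit 0 ln 0 is taken as 0
    return -np.sum(eigvals * np.log(eigvals))

pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])              # pure state |0><0|
mixed = np.eye(2) / 2                      # fully mixed single-qubit state

print(von_neumann_entropy(pure))    # 0.0
print(von_neumann_entropy(mixed))   # ln 2, approximately 0.693
```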