Entropy
Entropy is the thermodynamic property, often described as a measure of disorder, that increases as a system tends toward equilibrium. It was first postulated by Lazare Carnot in 1803.

The second law of thermodynamics states that the entropy of a closed system, one in which neither matter nor energy may enter or leave, never decreases over time.

Entropy is undeniable and yet creates perhaps insurmountable difficulties for many modern theories of physics. For example, it renders time asymmetric, resulting in an arrow of time that is difficult to reconcile with the theory of relativity. Entropy casts doubt on whether physical laws or the speed of light are invariant and perpetual.

Definitions

Thermodynamic definition

In classical thermodynamics, if a small amount of heat dQ is supplied reversibly to a system from a reservoir held at temperature T, the change in entropy is given by

<math>dS = \frac{dQ}{T}</math>

For a measurable change between two states i and f this expression integrates to

<math>\Delta S = S_f - S_i = \int_i^f \frac{dQ}{T}</math>
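As a rough illustration, the Python sketch below computes the entropy change for reversibly heating a body of constant heat capacity C from T_i to T_f. The heat capacity and temperatures are assumed values chosen purely for the example; the closed form C ln(T_f/T_i) is compared with a direct numerical integration of dQ/T.

<syntaxhighlight lang="python">
import numpy as np

# Entropy change for reversibly heating a body of constant heat capacity C
# from T_i to T_f.  Since dQ = C dT, integrating dQ/T gives
# Delta S = C * ln(T_f / T_i).  The numbers below are illustrative only.

C = 75.3                   # J/K, roughly one mole of liquid water (assumed)
T_i, T_f = 300.0, 350.0    # initial and final temperatures in kelvin (assumed)

# Closed-form result of integrating C dT / T
delta_S_exact = C * np.log(T_f / T_i)

# Numerical (trapezoidal) integration of dQ/T over many small temperature steps
T = np.linspace(T_i, T_f, 100_000)
delta_S_numeric = np.sum(np.diff(T) * (C / T[:-1] + C / T[1:]) / 2)

print(f"Delta S (closed form): {delta_S_exact:.4f} J/K")
print(f"Delta S (numerical):   {delta_S_numeric:.4f} J/K")
</syntaxhighlight>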

Statistical mechanics definition 1 (Boltzmann Entropy)

If a system can be arranged in W different ways, the entropy is

<math>S = k_B \ln W</math>

where <math>k_B</math> is Boltzmann's constant.
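A minimal Python sketch of the Boltzmann formula is shown below; the example system of N independent two-state particles, for which W = 2^N, is assumed purely for illustration.

<syntaxhighlight lang="python">
import math

# Boltzmann entropy S = k_B * ln(W) for a system with W equally likely
# microstates.  Example: N independent two-state particles, so W = 2**N.

k_B = 1.380649e-23   # Boltzmann constant in J/K

def boltzmann_entropy(W: int) -> float:
    """Entropy in J/K of a system with W equally likely microstates."""
    return k_B * math.log(W)

N = 100                      # number of two-state particles (assumed)
W = 2 ** N                   # total number of arrangements
print(boltzmann_entropy(W))  # equals N * k_B * ln(2)
</syntaxhighlight>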

Statistical mechanics definition 2 (Gibbs Entropy)

Label the different states a thermodynamic system can be in by <math>i = 1, 2, 3, \ldots</math>. If the probability of finding the system in state i is <math>p_i</math>, then the entropy is

<math>S = -k_B \sum_i p_i \ln p_i</math>

where <math>k_B</math> is the Boltzmann constant. This definition is closely related to ideas in information theory, where the definition of information content is very similar to the definition of entropy.
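The Gibbs formula can be evaluated directly from a list of state probabilities. The sketch below uses assumed probabilities for illustration and checks that a uniform distribution over W states reproduces the Boltzmann result k_B ln W.

<syntaxhighlight lang="python">
import math

# Gibbs entropy S = -k_B * sum_i p_i * ln(p_i) for a system whose states
# occur with probabilities p_i.  The probabilities below are assumed examples.

k_B = 1.380649e-23   # Boltzmann constant in J/K

def gibbs_entropy(probabilities):
    """Entropy in J/K given the probability of each microstate."""
    return -k_B * sum(p * math.log(p) for p in probabilities if p > 0)

# Uniform case: W equally likely states reproduces Boltzmann's S = k_B ln W
W = 8
uniform = [1.0 / W] * W
print(gibbs_entropy(uniform), k_B * math.log(W))   # the two values agree

# A non-uniform distribution over the same states has lower entropy
print(gibbs_entropy([0.7, 0.1, 0.1, 0.05, 0.05, 0.0, 0.0, 0.0]))
</syntaxhighlight>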

Entropy in Information Theory (Shannon Entropy)

For a discrete random variable X taking values <math>x_i</math> with probabilities <math>p(x_i)</math>, entropy is defined as

<math>H(X) = -\sum_i p(x_i) \log_2 p(x_i)</math>

For a continuous random variable with probability density f(x), the analogous quantity (the differential entropy), which in this case represents the number of bits necessary to quantize a signal to a desired accuracy, is given by

<math>h(X) = -\int f(x) \log_2 f(x)\, dx</math>
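For the discrete case, the Shannon entropy is straightforward to compute; the sketch below uses assumed example distributions (a fair coin, a fair four-sided die, and a biased coin) to illustrate the formula in bits.

<syntaxhighlight lang="python">
import math

# Shannon entropy of a discrete random variable,
# H(X) = -sum_i p(x_i) * log2 p(x_i), measured in bits.
# The distributions below are assumed examples.

def shannon_entropy(probabilities):
    """Entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(shannon_entropy([0.5, 0.5]))      # fair coin: 1 bit
print(shannon_entropy([0.25] * 4))      # fair 4-sided die: 2 bits
print(shannon_entropy([0.9, 0.1]))      # biased coin: about 0.47 bits
</syntaxhighlight>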

Entropy in Quantum Information Theory (Von Neumann Entropy)

In entangled systems, a useful quantity is the von Neumann entropy, defined (for a system with density matrix <math>\rho</math>) by

<math>S(\rho) = -\mathrm{Tr}(\rho \ln \rho)</math>

where Tr() indicates taking the trace of a matrix (the sum of its diagonal elements). This is a useful measure of entanglement: it is zero for a pure state and maximal for a fully mixed state.
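In practice the von Neumann entropy can be computed from the eigenvalues <math>\lambda_i</math> of the density matrix, since <math>S(\rho) = -\sum_i \lambda_i \ln \lambda_i</math>. The sketch below uses two assumed single-qubit states, one pure and one fully mixed, to show the two extremes.

<syntaxhighlight lang="python">
import numpy as np

# Von Neumann entropy S(rho) = -Tr(rho ln rho), computed from the
# eigenvalues of the density matrix.  The example states are assumed:
# a pure qubit state |0><0| and the fully mixed single-qubit state.

def von_neumann_entropy(rho: np.ndarray) -> float:
    """Entropy (in nats) of a density matrix rho."""
    eigenvalues = np.linalg.eigvalsh(rho)           # rho is Hermitian
    eigenvalues = eigenvalues[eigenvalues > 1e-12]  # drop zero eigenvalues
    return float(-np.sum(eigenvalues * np.log(eigenvalues)))

pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])        # |0><0|, a pure state
mixed = np.eye(2) / 2                # fully mixed single-qubit state

print(von_neumann_entropy(pure))     # 0.0
print(von_neumann_entropy(mixed))    # ln(2), the maximum for a qubit
</syntaxhighlight>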
