'''Entropy''' is a quantitative measure of the "disorder" in a system. It forms the basis of the [[second law of thermodynamics]], which states that entropy tends to increase. In other words, everything tends toward greater disorder in the absence of intelligent intervention.
  
The [[second law of thermodynamics]] states that entropy tends not to decrease over time within an [[isolated system]], where an isolated system is defined as one in which neither matter nor energy may enter or leave.<ref>As first postulated by Lazare Carnot in 1803, entropy is the thermodynamic property which trends toward equilibrium.</ref>
  
Entropy is undeniable and yet creates perhaps insurmountable difficulties for many modern theories of physics. For example, it renders time asymmetric, resulting in an [[arrow of time]] that is impossible to reconcile with the [[theory of relativity]]. Increasing entropy renders the [[theory of evolution]] implausible, because that theory claims that order is increasing. [[Liberal denial]] is thus common in ignoring the significance of the increase in disorder.

The entropy of a system only depends on the state the system is currently in. The change in entropy therefore only depends on the initial and final states of that system, and not the path taken by the system to reach that final state.
  
 
==Definitions==

===Thermodynamic definition===
In classical [[thermodynamics]], if a small amount of energy <math>dQ</math> is supplied to a system from a reservoir held at temperature <math>T</math>, the small change in entropy, <math>dS</math>, is given by
  
 
<math>
dS = \frac{dQ}{T}
</math>

For a measurable change between two states <math>i</math> and <math>f</math> this expression integrates to

<math>
\Delta S=\int_{i}^{f}\frac{dQ}{T}
</math>

where <math>i</math> and <math>f</math> represent the initial and final states of the system.

====Example of Thermodynamic Entropy====

Consider a cup of [[coffee]] of mass <math>m = 0.1 \ \mbox{kg}</math> and, since it is mostly water, [[Specific heat|specific heat capacity]] <math>c = 4.2 \times 10^{3} \ \mbox{J} \ \mbox{kg}^{-1} \ \mbox{K}^{-1}</math>. We shall assume that both <math>m</math> and <math>c</math> are constant. If we leave the coffee for a while, it will cool from <math>T_i = 50^{\circ} \mbox{C}</math> to <math>T_f = 20^{\circ} \mbox{C}</math>. The change in entropy is, from above:

<math>
\Delta S=\int_{i}^{f}\frac{dQ}{T}
</math>

We can relate the change in [[heat]], <math>dQ</math>, to the change in [[temperature]], <math>dT</math>, by <math>dQ = mc \, dT</math>. Then we can write:

<math>
\Delta S=\int_{T_i}^{T_f}\frac{mc \, dT}{T}= mc \ln{\frac{T_f}{T_i}}
</math>

Plugging in our numbers (with the temperatures converted to kelvin), we find that the change in entropy is <math>\Delta S = -40.9 \ \mbox{J} \ \mbox{K}^{-1}</math>. The change in entropy is negative. This does not violate the [[second law of thermodynamics]], as our cup of coffee is an open system, not an [[isolated system]].
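
As a rough check on this arithmetic, here is a short Python sketch (not part of the original example; the variable names are purely illustrative) that evaluates <math>\Delta S = mc \ln{(T_f/T_i)}</math> for the numbers above:

<pre>
import math

# Coffee example from above: Delta S = m * c * ln(T_f / T_i)
m = 0.1            # mass in kg
c = 4.2e3          # specific heat capacity of water in J kg^-1 K^-1
T_i = 50 + 273.15  # initial temperature, converted to kelvin
T_f = 20 + 273.15  # final temperature, converted to kelvin

delta_S = m * c * math.log(T_f / T_i)
print(round(delta_S, 1))  # prints -40.9 (J/K)
</pre>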
  
 
===Statistical mechanics definition 1 (Boltzmann Entropy)===

If a system can be arranged in <math>W</math> different ways, the entropy is
  
<math> S= k_B \ln W </math>

where <math>k_B</math> is Boltzmann's constant.<ref name="University Physics with Modern Physics">{{cite book |author=Hugh D. Young and Roger A. Freedman |title=University Physics with Modern Physics |publisher=Pearson |location=San Francisco |language=English}}</ref>

====Example of Boltzmann Entropy====

This example is based on one found in ''University Physics with Modern Physics''.<ref name="University Physics with Modern Physics"/> To see an example of the statistical nature of this law, consider flipping four coins, each of which can land either heads (H) or tails (T). The entropy can then be calculated using the formula above.

{| class="wikitable"
|-
!ID
!Combinations
!Number of Combinations
!Entropy <math>S/k_B</math>
|-
|1
|HHHH
|1
|<math>\ln{1} = 0</math>
|-
|2
|THHH<br/>
HTHH<br/>
HHTH<br/>
HHHT
|4
|<math>\ln{4} \approx 1.39</math>
|-
|3
|TTHH<br/>
THTH<br/>
THHT<br/>
HTTH<br/>
HTHT<br/>
HHTT
|6
|<math>\ln{6} \approx 1.79</math>
|-
|4
|TTTH<br/>
TTHT<br/>
THTT<br/>
HTTT
|4
|<math>\ln{4} \approx 1.39</math>
|-
|5
|TTTT
|1
|<math>\ln{1} = 0</math>
|}
  
Intuitively, the macro-state with the highest entropy (or "disorder") is the third one, as it corresponds to the macro-state with the greatest number of micro-states. In the same way, although when you throw four coins you expect to get two heads and two tails, it is '''not''' impossible to get four heads; it is only less likely.
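
As an illustrative aside (not from the cited textbook), the micro-state counting in the table can be reproduced with a few lines of Python; the entropy is reported in units of <math>k_B</math>:

<pre>
import math
from itertools import product

# Count the micro-states of four coin flips for each macro-state
# (number of heads), then compute S / k_B = ln W.
counts = {}
for flips in product("HT", repeat=4):   # all 16 equally likely micro-states
    heads = flips.count("H")
    counts[heads] = counts.get(heads, 0) + 1

for heads, W in sorted(counts.items(), reverse=True):
    print(heads, "heads: W =", W, ", S/k_B =", round(math.log(W), 2))
</pre>

The macro-state with two heads has the most micro-states (<math>W = 6</math>) and therefore the largest entropy, in agreement with the table.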
  
 
===Statistical mechanics definition 2 (Gibbs Entropy)===

Label the different states a thermodynamic system can be in by <math>i</math>. If the probability of finding the system in state <math>i</math> is <math>p_i</math>, then the entropy is

<math>
S = -k_B \sum_{i} p_i \ln{p_i}
</math>

where <math>k_B</math> is the Boltzmann constant. This definition is closely related to ideas in information theory, where the definition of information content is very similar to the definition of entropy.

===Entropy in Information Theory (Shannon Entropy)===

For a discrete random variable with outcome probabilities <math>p_i</math>, entropy is defined as

<math>
H = -\sum_{i} p_i \log_2{p_i}
</math>

For a continuous random variable with probability density <math>f(x)</math>, the analogous description for entropy, which in this case represents the number of bits necessary to quantize a signal to a desired accuracy, is given by

<math>
h = -\int f(x) \log_2{f(x)} \, dx
</math>
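
To make the connection between these definitions concrete, here is a minimal Python sketch (an illustration, not part of the article) that computes the Shannon entropy, in bits, of a discrete probability distribution; multiplying by <math>k_B</math> and switching to the natural logarithm would give the Gibbs entropy of the same distribution:

<pre>
import math

def shannon_entropy(probabilities):
    # H = -sum p_i * log2(p_i), with the convention 0 * log2(0) = 0
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(shannon_entropy([0.5, 0.5]))                # 1.0 bit  (one fair coin)
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits (two fair coins)
print(shannon_entropy([1.0]))                     # 0.0 bits (a certain outcome)
</pre>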
 
===Entropy in Quantum Information Theory (Von Neumann Entropy)===

In entangled systems, a useful quantity is the Von Neumann Entropy, defined (for a system with density matrix <math>\rho</math>) by
  
 
<math>
S = -\mathrm{Tr}(\rho \ln{\rho})
</math>

where <math>\mathrm{Tr}()</math> indicates taking the trace of a matrix (the sum of the diagonal elements). This is a useful measure of entanglement, which is zero for a pure state, and maximal for a fully mixed state.
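
For readers who want to experiment, the following Python/NumPy sketch (illustrative only; the function name is not standard) computes the Von Neumann entropy from the eigenvalues of a density matrix, using the fact that <math>-\mathrm{Tr}(\rho \ln{\rho}) = -\sum_i \lambda_i \ln{\lambda_i}</math>, where <math>\lambda_i</math> are the eigenvalues of <math>\rho</math>:

<pre>
import numpy as np

def von_neumann_entropy(rho):
    # Diagonalise the (Hermitian) density matrix and sum -lambda * ln(lambda),
    # dropping zero eigenvalues since 0 ln 0 = 0.
    eigenvalues = np.linalg.eigvalsh(rho)
    p = eigenvalues[eigenvalues > 1e-12]
    return float(-np.sum(p * np.log(p)))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])    # a pure state
mixed = np.array([[0.5, 0.0], [0.0, 0.5]])   # a fully mixed qubit

print(von_neumann_entropy(pure))   # 0.0
print(von_neumann_entropy(mixed))  # about 0.693, i.e. ln 2
</pre>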
 
==See also==

*[[history of entropy|History of the development of entropy]]
*[[Thermodynamics]]
*[[Second law of thermodynamics]]
*[[Genetic entropy]]
  
 
==References==

<references/>

[[Category:Physics]]
[[Category:Thermodynamics]]
[[Category:Second Law of Thermodynamics]]
