Statistical mechanics is the use of statistics and probability theory to analyze the behavior of a vast number of interacting particles. It provides a microscopic-level interpretation of macroscopic thermodynamic quantities like work, heat and entropy. The essential problem in statistical mechanics is to determine the distribution of a certain amount of energy E among N identical objects.
A fundamental definition in statistical mechanics is that of a microstate. A microstate is a complete microscopic description of the system. For example, for a gas sample, its microstate is specified by the position and velocity of every molecule of the gas. The fundamental postulate of statistical mechanics is that for an isolated system in equilibrium, all its accessible microstates are equally probable.
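As a concrete illustration of the counting problem mentioned above, the following sketch counts the microstates obtained when E indistinguishable energy quanta are distributed among N distinguishable objects. The formula comb(E + N - 1, N - 1) is the standard "stars and bars" result for this kind of counting; the function name and example values are illustrative, not from the text.

```python
from math import comb

def count_microstates(E, N):
    """Number of ways to distribute E indistinguishable energy quanta
    among N distinguishable objects (stars-and-bars counting)."""
    return comb(E + N - 1, N - 1)

# 3 quanta among 2 objects: (0,3), (1,2), (2,1), (3,0)
print(count_microstates(3, 2))  # 4
```

By the fundamental postulate, each of these microstates is equally probable for an isolated system in equilibrium.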
Entropy in statistical mechanics
A macrostate is a partial specification of the system in terms of macroscopic quantities. For example, for the sample of gas, one possible macrostate is that in which all molecules are on the left-hand side of the container (state A), and another is that in which the molecules are evenly distributed throughout the available space (state B). In general, several microstates correspond to a single macrostate. The number of microstates corresponding to a macrostate is called the weight function. The higher the weight function, the higher the probability that the system is in that macrostate. Systems evolve in time toward the macrostates with the highest number of microstates. Since there are vastly more microstates for state B than for state A, a gas initially confined to the left-hand side of the container will disperse throughout the whole container as time passes. Nothing prohibits the gas from suddenly returning to the left side; it is just extremely unlikely to happen, because the probability that the system is in state B is overwhelmingly higher. In short, a system always evolves from less probable to more probable states.

This gives us a statistical definition of entropy: entropy is related to the number of microstates of the system. A system with many microstates has a high entropy, and one with few microstates has a low entropy. The thermodynamic observation that entropy always increases can thus be explained by the fact that systems go from improbable (low-entropy) states to probable (high-entropy) states.
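The argument above can be made quantitative for the gas example. If each of N molecules is independently on the left or right half of the container, the macrostate "n molecules on the left" has weight function C(N, n), and the Boltzmann relation S = k ln W assigns it an entropy. The sketch below, with illustrative function names and a small N chosen for readability, compares state A (all molecules on the left) with state B (molecules evenly split):

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann constant, J/K

def weight(N, n_left):
    """Weight function: number of microstates with n_left of N molecules
    on the left-hand side of the container."""
    return comb(N, n_left)

def entropy(W):
    """Boltzmann entropy S = k ln W."""
    return k_B * log(W)

N = 100
W_A = weight(N, N)       # state A: all molecules on the left -> exactly 1 microstate
W_B = weight(N, N // 2)  # state B: evenly distributed -> an astronomically larger count

print(W_A)                           # 1
print(entropy(W_B) > entropy(W_A))   # True: state B has the higher entropy
```

Even for only 100 molecules, W_B exceeds W_A by roughly 29 orders of magnitude, which is why the spontaneous return of the gas to the left half is never observed in practice.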