# What is entropy? By Jeff Phillips

There’s a concept that’s crucial to chemistry and physics. It helps explain why physical processes go one way and not the other: why ice melts, why cream spreads in coffee, why air leaks out of a punctured tire. It’s entropy, and it’s notoriously difficult to wrap our heads around. Jeff Phillips gives a crash course on entropy.

“In statistical mechanics, entropy (usual symbol S) is related to the number of microscopic configurations Ω that a thermodynamic system can have when in a state specified by some macroscopic variables. Specifically, assuming for simplicity that each microscopic configuration is equally probable, the entropy of the system is the natural logarithm of that number of configurations, multiplied by the Boltzmann constant k_B. Formally, S = k_B ln Ω. This is consistent with 19th-century formulas for entropy in terms of heat and temperature. Boltzmann’s constant, and therefore entropy, have dimensions of energy divided by temperature.

For example, gas in a container with known volume, pressure, and energy could have an enormous number of possible configurations of the collection of individual gas molecules. At equilibrium, each instantaneous configuration of the gas may be regarded as random. Entropy may be understood as a measure of disorder within a macroscopic system. The second law of thermodynamics states that an isolated system’s entropy never decreases. Such systems spontaneously evolve towards thermodynamic equilibrium, the state with maximum entropy. Non-isolated systems may lose entropy, provided their environment’s entropy increases by at least that amount. Since entropy is a function of the state of the system, a change in entropy of a system is determined by its initial and final states. This applies whether the process is reversible or irreversible. However, irreversible processes increase the combined entropy of the system and its environment.” (Source: https://en.wikipedia.org/wiki/Entropy)
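The two ideas in the excerpt, the Boltzmann formula S = k_B ln Ω and the claim that equilibrium is the macrostate with maximum entropy, can be made concrete with a toy model that is not from the excerpt itself: N distinguishable particles, each independently in the left or right half of a box. The macrostate "k particles on the left" has C(N, k) microstates, so its entropy follows directly from counting. The particle count N = 100 below is an arbitrary illustrative choice.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def boltzmann_entropy(omega: int) -> float:
    """Entropy S = k_B * ln(Omega) of a macrostate with Omega microstates."""
    return K_B * math.log(omega)

# Toy model: N particles, each in the left or right half of a box.
# The macrostate "k particles on the left" has C(N, k) microstates.
N = 100
for k in (0, 25, 50):
    omega = math.comb(N, k)  # number of microstates for this macrostate
    print(f"k={k:3d}  microstates={omega:.3e}  S={boltzmann_entropy(omega):.3e} J/K")
```

The count C(N, k), and with it the entropy, peaks at k = N/2: the evenly mixed macrostate has overwhelmingly more microstates than the lopsided ones, which is why an isolated gas spontaneously spreads to fill its container rather than gathering on one side.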