Entropy is a measure of disorder or randomness in a system. It is a concept used across diverse fields, including thermodynamics, statistical physics, information theory, cosmology, and even economics. Rudolf Clausius, a German physicist, introduced the concept in 1850, initially calling it "transformation-content". The second law of thermodynamics states that in any spontaneous process within an isolated system, entropy either increases or remains constant; it never decreases. One consequence is that heat flows spontaneously from hotter objects to colder ones, never the reverse.
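To see the second law in numbers, consider a fixed quantity of heat Q passing between two thermal reservoirs. By the Clausius relation, a reservoir at temperature T that absorbs heat Q changes its entropy by ΔS = Q/T. The short Python sketch below is a minimal illustration; the temperatures and heat quantity are hypothetical values chosen only to show that the total entropy change is positive when heat flows from hot to cold.

```python
# Entropy change when heat Q flows from a hot reservoir to a cold one.
# All numbers here are illustrative; the point is the sign of the total.

T_hot = 400.0   # temperature of the hot reservoir, in kelvin (hypothetical)
T_cold = 300.0  # temperature of the cold reservoir, in kelvin (hypothetical)
Q = 1000.0      # heat transferred from hot to cold, in joules (hypothetical)

dS_hot = -Q / T_hot    # hot reservoir loses entropy:      -2.50 J/K
dS_cold = Q / T_cold   # cold reservoir gains more entropy: +3.33 J/K
dS_total = dS_hot + dS_cold

print(f"Total entropy change: {dS_total:+.2f} J/K")  # +0.83 J/K, greater than zero
```

Reversing the direction of flow (heat moving from cold to hot) would flip both signs and make the total negative, which is exactly what the second law rules out for a spontaneous process.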
In thermodynamics, entropy is a state variable: its value depends only on the current state of the system, not on the path by which the system reached that state. It is also an extensive property, scaling with the amount of material in the system. While often described as disorder, entropy can also be understood as the portion of a system's energy that is unavailable for doing useful work. In information theory, entropy quantifies the average uncertainty, or expected information content, of a random variable. Claude Shannon introduced this concept in his 1948 paper "A Mathematical Theory of Communication". Everyday examples of increasing entropy include ice melting, sugar dissolving in coffee, and a room becoming messy.
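Shannon's definition has a simple closed form: for a discrete random variable X with outcome probabilities p_i, the entropy is H(X) = -Σ p_i log₂ p_i, measured in bits. The sketch below (a minimal Python illustration; the function name is ours) computes it for a few distributions, showing that uncertainty peaks when outcomes are equally likely and vanishes when the outcome is certain.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits.

    Outcomes with zero probability contribute nothing,
    since p * log2(p) -> 0 as p -> 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit, maximal for two outcomes
print(shannon_entropy([0.9, 0.1]))  # biased coin: ~0.47 bits, more predictable
print(shannon_entropy([1.0]))       # certain outcome: 0.0 bits, no information
```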