entropy (countable and uncountable, plural entropies)
1. A measure of the disorder present in a system.
   1. (Boltzmann definition) A measure of the disorder directly proportional to the natural logarithm of the number of microstates yielding an equivalent thermodynamic macrostate (see the formula after this list).
   2. (information theory) Shannon entropy (see the code sketch after this list).
2. (thermodynamics, countable) A measure of the amount of energy in a physical system that cannot be used to do work (see the Clausius relation after this list).
   1. The capacity factor for thermal energy that is hidden with respect to temperature.
   2. The dispersal of energy; how much energy is spread out in a process, or how widely spread out it becomes, at a specific temperature.
3. (statistics, information theory, countable) A measure of the amount of information and noise present in a signal.
4. (uncountable) The tendency of a system that is left to itself to descend into chaos.
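
For sense 1.1, the Boltzmann definition is conventionally written as the following relation, where S is the entropy, k_B is the Boltzmann constant, and W is the number of microstates yielding the given macrostate:

```latex
S = k_{\mathrm{B}} \ln W
```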
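
Senses 1.2 and 3 refer to Shannon entropy, H(X) = -∑ p(x) log₂ p(x), the average information content of a source in bits. A minimal Python sketch of this formula (the function name shannon_entropy and the coin-toss strings are illustrative, not part of the entry):

```python
import math
from collections import Counter

def shannon_entropy(data):
    """Shannon entropy H(X) = -sum(p * log2(p)) of a sequence, in bits per symbol."""
    counts = Counter(data)          # frequency of each distinct symbol
    total = len(data)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# A fair coin attains the 1-bit maximum; a biased source carries less information,
# matching the "information and noise in a signal" reading of sense 3.
print(shannon_entropy("HTHTHTHT"))  # 1.0
print(shannon_entropy("HHHHHHHT"))  # ~0.544
```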
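
The thermodynamic sense 2 is commonly made precise through the Clausius relation, where dS is the entropy change produced by a reversible transfer of heat δQ_rev at absolute temperature T; this is one standard way to express the "capacity factor" and "dispersal of energy" readings of senses 2.1 and 2.2:

```latex
dS = \frac{\delta Q_{\mathrm{rev}}}{T}
```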