Entropy

Entropy is a concept applied across physics, information theory, mathematics and other branches of science and engineering. The following definition is shared across all these fields:

$$S = -k \sum_i p_i \ln p_i$$

where S is the conventional symbol for entropy. The sum runs over all microstates consistent with the given macrostate, and $p_i$ is the probability of the ith microstate. The constant of proportionality k depends on what units are chosen to measure S. When SI units are chosen, we have k = k_B = Boltzmann's constant = 1.38066×10⁻²³ J K⁻¹. If units of bits are chosen, then k = 1/ln(2), so that $S = -\sum_i p_i \log_2 p_i$.
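A minimal sketch of this definition in Python (the function name, probability check, and example distribution are illustrative choices, not from the article):

```python
import math

K_B = 1.38066e-23  # Boltzmann constant in J/K, as given in the text above

def entropy(probs, k=1.0):
    """Compute S = -k * sum_i p_i * ln(p_i) over microstate probabilities."""
    if not math.isclose(sum(probs), 1.0, rel_tol=1e-9):
        raise ValueError("probabilities must sum to 1")
    # Terms with p = 0 contribute nothing (p ln p -> 0), so skip them.
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# Four equally likely microstates:
p = [0.25] * 4
print(entropy(p, k=K_B))             # thermodynamic entropy in J/K
print(entropy(p, k=1 / math.log(2)))  # entropy in bits: exactly 2.0
```

With k = 1/ln(2) the same sum reduces to the base-2 logarithm form, which is why the two unit conventions differ only by the constant of proportionality.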

Entropy is central to the second law of thermodynamics. The second law in conjunction with the fundamental thermodynamic relation places limits on a system's ability to do useful work.
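One standard way to make that limit concrete (a textbook sketch, not spelled out in this article): combining the fundamental thermodynamic relation with the second law bounds the work a system can deliver at constant temperature by the decrease in its Helmholtz free energy.

```latex
\documentclass{article}
\begin{document}
% Fundamental thermodynamic relation for a simple closed system:
\[ dU = T\,dS - P\,dV \]
% With the second law, $dS \ge \delta Q / T$, the work extractable in an
% isothermal process is bounded by the drop in Helmholtz free energy:
\[ W_{\max} \le -\Delta F, \qquad F = U - TS \]
\end{document}
```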

The second law can also be used to predict whether a physical process will proceed spontaneously. Spontaneous changes in isolated systems occur with an increase in entropy.
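As a concrete illustration (this free-expansion example and the function below are ours, not from the article): an ideal gas expanding freely into a doubled volume at constant temperature is a spontaneous change in an isolated system, and its entropy increase follows from the standard result ΔS = nR ln(V₂/V₁).

```python
import math

R = 8.314  # molar gas constant, J/(mol K)

def delta_s_free_expansion(n_mol, v_initial, v_final):
    """Entropy change for isothermal free expansion of an ideal gas:
    dS = n * R * ln(V_final / V_initial)."""
    return n_mol * R * math.log(v_final / v_initial)

dS = delta_s_free_expansion(1.0, 1.0, 2.0)
print(f"dS = {dS:.2f} J/K")  # about +5.76 J/K > 0, so the change is spontaneous
```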

The word "entropy" is derived from the Greek εντροπία "a turning towards" (εν- "in" + τροπή "a turning").

This text uses material from Wikipedia, licensed under CC BY-SA