New center to replace oil and gas with sustainable chemistry

Many of the things that surround us are chemically derived from fossil gas and oil—from washing powders to phones to pharmaceuticals. As such, chemistry contributes to CO2 emissions in the same way as, for example, flying ...

Quantum vacuum: Less than zero energy

Energy is a quantity that must always be positive—at least that's what our intuition tells us. If every single particle is removed from a certain volume until there is nothing left that could possibly carry energy, then ...

Entropy explains RNA diffusion rates in cells

Recent studies have revealed that within cells of both yeast and bacteria, the rates of diffusion of RNA–protein complexes—large molecules that convey important information throughout the cell—are distributed in characteristic ...

Materials informatics reveals new class of super-hard alloys

A new method of discovering materials using data analytics and electron microscopy has found a new class of extremely hard alloys. Such materials could potentially withstand severe impact from projectiles, thereby providing ...

No assumptions needed to simulate petroleum reservoirs

Hidden deep below our feet, petroleum reservoirs are made up of hydrocarbons like oil and natural gas, stored within porous rock. These systems are particularly interesting to physicists, as they clearly show how temperature ...

Pressure makes best cooling

Phase transitions take place as heat (i.e., entropy) is exchanged between materials and the environment. When such processes are driven by pressure, the induced cooling effect is called the barocaloric effect, which is a ...

Entropy is a concept applied across physics, information theory, mathematics and other branches of science and engineering. The following definition is shared across all these fields:

S = -k \sum_i p_i \ln p_i

where S is the conventional symbol for entropy. The sum runs over all microstates consistent with the given macrostate, and p_i is the probability of the ith microstate. The constant of proportionality k depends on what units are chosen to measure S. When SI units are chosen, we have k = k_B = Boltzmann's constant = 1.38066 × 10^{-23} J K^{-1}. If units of bits are chosen, then k = 1/\ln(2), so that S = -\sum_i p_i \log_2 p_i.
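This definition can be evaluated directly. A minimal Python sketch (the helper name `entropy` is ours, not from the excerpt) that computes S for a discrete probability distribution, with the unit fixed by the choice of k:

```python
import math

def entropy(probs, k=1.0):
    """Return S = -k * sum(p_i * ln p_i) over the nonzero probabilities.

    k = 1.0 gives S in nats; k = 1/ln(2) gives S in bits;
    k = 1.38066e-23 (Boltzmann's constant, as quoted above) gives S in J/K.
    """
    return -k * sum(p * math.log(p) for p in probs if p > 0.0)

# Two equally likely microstates (a fair coin):
uniform2 = [0.5, 0.5]
print(entropy(uniform2))                     # ln(2) ≈ 0.693 nats
print(entropy(uniform2, k=1 / math.log(2)))  # exactly 1 bit
```

Note that terms with p_i = 0 are skipped, matching the convention 0 ln 0 = 0; a certain outcome ([1.0]) therefore has zero entropy in any unit.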

Entropy is central to the second law of thermodynamics. The second law in conjunction with the fundamental thermodynamic relation places limits on a system's ability to do useful work.
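The limit on useful work can be made explicit with a standard derivation (not part of this excerpt): for a system exchanging heat with a bath at fixed temperature T, the first law gives \delta W = \delta Q - dU, and the Clausius inequality bounds the heat by \delta Q \le T\,dS, so

\delta W \le T\,dS - dU = -d(U - TS) = -dF \quad (\text{constant } T),

i.e. the work extractable in any isothermal process is bounded by the decrease in the Helmholtz free energy F = U - TS, with equality only for reversible processes.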

The second law can also be used to predict whether a physical process will proceed spontaneously. Spontaneous changes in isolated systems occur with an increase in entropy.

The word "entropy" is derived from the Greek εντροπία "a turning towards" (εν- "in" + τροπή "a turning").

This text uses material from Wikipedia, licensed under CC BY-SA