Redefining the kilogram

February 20, 2012, National Physical Laboratory

New research, published by the National Physical Laboratory (NPL), takes a significant step towards changing the international definition of the kilogram – which is currently based on a lump of platinum-iridium kept in Paris. NPL has produced technology capable of accurate measurements of Planck's constant, the final piece of the puzzle in moving from a physical object to a kilogram based on fundamental constants of nature. The techniques are described in a paper published in Metrologia on 20 February.

The international system of units (SI) is the most widely used system of measurement for commerce and science. It comprises seven base units (metre, kilogram, second, kelvin, ampere, mole and candela). Ideally these should be stable over time and universally reproducible, which requires definitions based on fundamental constants of nature. The kilogram is the only unit still defined by a physical artefact.

In October 2011, the General Conference on Weights and Measures (CGPM) agreed that the kilogram should be redefined in terms of Planck's constant (h). It deferred a final decision until there was sufficient consistent and accurate data to agree a value for h. This paper describes how this can be done with the required level of certainty. It provides a measured value of h and extensive analysis of possible uncertainties that can arise during experimentation. Although these results alone are not enough, consistent results from other measurement institutes using the techniques and technology described in this paper will provide an even more accurate consensus value and a change to the way the world measures mass – possibly as soon as 2014.

Planck's constant is a fundamental constant of nature which relates the frequency (colour) of a particle of light (a photon) to its energy. Two quantum mechanical effects discovered in the last 60 years – the Josephson effect and the quantum Hall effect – allow electrical power to be measured in terms of Planck's constant (and time).
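The link between electrical power and h can be sketched numerically. In this illustrative Python snippet the Josephson relation fixes a voltage and the quantum Hall relation fixes a resistance; the drive frequency, junction count and plateau index are invented illustrative values, not figures from the NPL experiment. The point is algebraic: the resulting power depends only on h and frequencies, with no reference to any material artefact.

```python
# Illustrative sketch (assumed values): electrical power in terms of h
# via the Josephson and quantum Hall effects.
h = 6.62607015e-34   # J*s, Planck constant (CODATA)
e = 1.602176634e-19  # C, elementary charge (CODATA)

K_J = 2 * e / h      # Josephson constant: V = n * f / K_J
R_K = h / e**2       # von Klitzing constant: R = R_K / i

f = 75e9     # Hz, microwave drive frequency (assumed)
n = 100000   # effective number of Josephson voltage steps (assumed)
i = 2        # quantum Hall plateau index (assumed)

V = n * f / K_J      # voltage from the Josephson effect
R = R_K / i          # resistance from the quantum Hall effect
P = V**2 / R         # electrical power

# Algebraically P = (n**2 * i / 4) * f**2 * h: only h and frequencies
# (i.e. time) appear, which is why fixing h fixes electrical power.
P_from_h = (n**2 * i / 4) * f**2 * h
assert abs(P - P_from_h) / P < 1e-9
```

The cancellation of e between the two constants is what makes the combination depend on h alone.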

A piece of kit called the watt balance – first proposed by Brian Kibble at the National Physical Laboratory in 1975 – relates electrical power to mechanical power. This allows it to make very accurate measurements of Planck's constant in terms of the SI units of mass, length and time. The SI units of length and time are already fixed in terms of fundamental and atomic constants. If the value of h is fixed, the watt balance would provide a method of measuring mass.

Dr Ian Robinson, who leads the project at the National Physical Laboratory, explains how the watt balance works: "The watt balance divides its measurement into two parts to avoid the errors which would arise if real power was measured. The principle can be illustrated by considering a loudspeaker placed on its back. Placing a mass on the cone will push it downwards, and it can be restored to its former position by passing a current through the speaker coil. The ratio of the force generated to the current is fixed for a particular loudspeaker coil and magnet, and is measured in the second part of the experiment by moving the speaker cone and measuring the ratio of the voltage produced at the speaker terminals to the velocity of the cone.

When the results of the two parts of the experiment are combined, the product of voltage and current (electrical power) is equated to the product of weight and velocity (mechanical power) and the properties of the loudspeaker coil and magnet are eliminated, leaving a measurement of the weight of the mass which is independent of the particular speaker used."
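The two-mode scheme described above can be sketched with a toy calculation. In this hedged Python sketch the coil/magnet factor Bl, the mass, and the velocity are invented illustrative numbers, not measured quantities; what it shows is that Bl cancels when the two modes are combined, leaving electrical power equal to mechanical power.

```python
# Toy model of the watt balance's two modes (all values assumed).
Bl = 250.0    # T*m, coil/magnet geometry factor - never measured directly
m = 1.0       # kg, test mass
g = 9.80665   # m/s^2, local gravitational acceleration

# Weighing mode: the current I needed to balance the weight, m*g = Bl*I.
I = m * g / Bl

# Moving mode: the coil moved at velocity v induces a voltage V = Bl*v.
v = 0.002     # m/s, coil velocity (assumed)
V = Bl * v

# Combined: electrical power V*I equals mechanical power m*g*v,
# and Bl has cancelled out of the result.
assert abs(V * I - m * g * v) < 1e-12

# The mass is recovered without ever knowing Bl.
mass = V * I / (g * v)
```

Because V and I are themselves measured via the Josephson and quantum Hall effects, the recovered mass is tied to Planck's constant rather than to any artefact.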

Measurements of h using watt balances have achieved uncertainties approaching two parts in a hundred million, the level required to base the kilogram on Planck's constant. Thanks to improvements highlighted in the paper published today, measurements at the National Research Council in Canada, which is now using the NPL equipment, look set to provide considerably greater accuracy.

Another set of data comes from NIST, the USA's national measurement institute. Currently the watt balance at NIST is showing slightly different results, and the differences are being investigated. If the results are found to be consistent, it will be the beginning of the end for the physical kilogram.

A Planck-based kilogram would mean a universal standard that could be replicated anywhere at any time. It would also bring much greater long-term certainty to scientists who rely on the SI for precise measurements, or on h itself. The watt balance would provide a means of realising and disseminating the redefined unit of mass.

Dr Robinson concludes: "This is an example of British science leading the world. NPL invented the watt balance and has produced an apparatus and measurements which will contribute to the redefinition. The apparatus is now being used by Canada to continue the work, and we anticipate their results will have lower uncertainties than we achieved. The principle is also being used by the US and other laboratories around the world to make their own measurements."

"This research will underpin the world's measurement system and ensure the long term stability of the very top level of mass measurement. Although the man on the street won't see much difference - you'll still get the same 1kg bag of potatoes – these standards will ultimately be used to calibrate the world's weighing systems, from accurate scientific instruments, right down the chain to domestic scales."


More information: The paper, "Toward the redefinition of the kilogram: A measurement of the Planck constant using the NPL Mark II watt balance", is published in Metrologia, the leading international measurement science journal, published by IOP Publishing on behalf of the Bureau International des Poids et Mesures (BIPM).





2.5 / 5 (2) Feb 20, 2012
A Planck based kilogram would mean a universal standard that could be replicated anywhere at any time.

As long as you're in a lab with million dollar equipment.

It's all well and good, but in practice no-one can check how long a meter is either, by using a laser and an atomic clock, because such tools are simply not available to everyone. You'll still have to send your weights and your measuring sticks to be calibrated in some special laboratory just like before.
1 / 5 (7) Feb 20, 2012
In my theory the changing mass of the kilogram prototype is connected with uncertainties in the gravitational constant and the recently observed dilatation of the iridium metre prototype. http://www.physor...s64.html IMO the solar system is passing through a dense cloud of dark matter, or maybe the gravitational shadow of the galactic centre. The increased density of low-energy neutrinos and gravitational waves makes the vacuum more dense, and massive objects swell in it and become less heavy. These changes are minute, but the replacement of one prototype with another will not help us with this situation unless the experimental apparatus can account for the changes of vacuum density. For example, if we fix the kilogram definition with the watt balance, the definition of the metre will fluctuate instead.
1 / 5 (6) Feb 20, 2012
When a massive object appears inside a dense cloud of dark matter, it will expand, i.e. it becomes less dense and heavy (of lower gravity) and more transparent (the fine structure constant will increase and converge to a unitary value). On the other hand, the contemporary SI definition of the metre is based on the wavelength of light in vacuum, so time will slow down at the same rate as the speed of light, and we cannot observe any difference with it. But the length of measures based on the iridium metre prototype will expand relative to the laser metre standard in the same way as the arms of the watt balance, so we will be forced to recalibrate them often. On the other hand, this recalibration will enable us to observe the changes of vacuum density more reliably. The only problem is that the contemporary accuracy of the watt balance is one and a half orders of magnitude below the required value.
3.7 / 5 (3) Feb 20, 2012
"In my theory the changing mass of kilogram prototype" - Kinedryl

I thought that was part of your theory of advanced hyperfoombidic flush toilets.

Please familiarize yourself with what is required before an idea becomes a scientific theory.

1 / 5 (6) Feb 20, 2012
Vacuum density is indeed a concept of dense aether theory. But this theory doesn't imply that such a density must change over time. I cannot predict whether the density of vacuum will change positively or negatively in the future. But if it changes in some direction, we can predict the sign of the related effects and events and judge whether they're really related to each other. In this sense the theory is testable.

I don't care whether some model is considered "scientific" by the scientific establishment, because this establishment has an apparent tendency to ignore all uncomfortable ideas and findings for years, so it cannot serve as a criterion of itself. For this reason I only care whether my ideas are correct or wrong. If they are proven correct, I presume they will be labelled "scientific" automatically once all their opponents have died out.
1 / 5 (4) Feb 20, 2012
For example, the concept of a black hole was originally proposed by the geologist John Michell in a personal letter written to Henry Cavendish in 1783, where he described a body so massive that even light could not escape. Is such an idea "scientific enough"? Isn't it the whole basis of the later model of black holes in general relativity?
5 / 5 (1) Feb 20, 2012
"In my theory the changing mass of kilogram prototype" - Kinedryl

I thought that was part of your theory of advanced hyperfoombidic flush toilets...

I had exactly the same thought except I was thinking it was about what was being flushed into the hyperspace created by hyperfoombidic flush toilets.
1 / 5 (4) Feb 21, 2012
In my theory the global warming effects (which are observable across the whole solar system) result from the same source as the dilatation and loss of weight of the iridium prototypes, the recent fluctuations of the gravitational constant and the speed of light, the recent increase in asteroid density and volcanic activity, etc...

The dense aether model explains how these phenomena are mutually related – but it cannot predict them. After all, general relativity predicts the fall of a meteorite when it appears in the proximity of Earth, but it cannot predict when or how such a meteorite emerges. You need an additional theory/model for that.
1 / 5 (4) Feb 21, 2012
What characterises these phenomena is their low degree of correlation. Their connection emerges only after considering a sufficiently general theory. This is a problem for contemporary science, which is A) overspecialized and fragmented – we have many experts, but they're all dealing with narrow areas of physics – and B) demanding of a relatively high degree of correlation before claiming some connection as real. We could get a sufficiently high degree of correlation if we considered all these phenomena together as a whole – but we cannot, because every expert can handle only a limited number of correlations in a qualified manner. In this way many boundary phenomena, which consist of many correlations, may escape the attention of specialized experts. The dark matter origin of global warming is not the only phenomenon in this category: cold fusion and various psychic effects, which require a broad multidisciplinary qualification, have the same problem with their acceptance.
