And with that, I mourn the obsolescence of my last remaining "absolute unit" joke and will have to develop something equally satisfying based on dressed lumber to replace it.
The difference is more epistemological than practical. Before, 6.022x10^23 was the measured value of the conversion factor between a gram (exactly 1/1000 of the International Prototype Kilogram) and a dalton (exactly 1/12 the mass of a carbon-12 atom). After the redefinition, the mole is fixed at exactly 6.02214076x10^23 entities, and the molar mass of carbon-12 is measured to be approximately, no longer exactly, 12 g/mol. In essence, the dependency chain was reversed: the entity count went from measured to exact, and the gram-to-dalton ratio went from exact to measured.
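To make the reversal concrete (my summary of the before/after bookkeeping; the post-2019 value is the exact one):

    Before 2019:  N_A measured (~6.02214x10^23 /mol);   M(12C) = 12 g/mol exactly
    After 2019:   N_A = 6.02214076x10^23 /mol exactly;  M(12C) ~ 12 g/mol, now measured

The relation M(12C) = N_A * m(12C) holds throughout; what changed is which factor carries the error bar.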
For the next revision: remove the mole from the SI. It's completely pointless. A number is not a unit!
Instead, replace it with the bit, the fundamental unit of information.
The reason seems to be that chemists don't care how many atoms there are in their reactions. Exact counts would be inconvenient: fractions of atoms don't make sense, and the numbers are far from human scale. They just want an agreement, so that even if they don't know the exact number of atoms in their test tube, they know they all have the same number. That's what units are for.
The definitions of units are made in the most convenient and precise way possible, and they are updated as science progresses. For the kg, the artifact used to be the best we had; alternative methods weren't precise enough. They changed that to the Kibble balance because it is now better. It may change again at a later time if we find something better.
Back to the mole: it used to be defined via the ratio between the kilogram and the mass of a carbon-12 atom because that was the best we had. Now that we can count atoms with more precision, we decided to change it to just the Avogadro constant, which becomes fixed. Again, it might change. What may happen (or may have happened; see the Avogadro project) is that we define the kilogram using the mole, making it fundamental.
The point of the mole unit is this: 1 mole of nucleons (protons or neutrons) has a mass of 1 gram. So while a mole is "just" a name for a large number (akin to billion or quintillion), the actual underlying exact count isn't very important. Instead what matters is making stoichiometric computations easy to do from first principles: given the formula of a molecule and the atomic weights of each element, you can compute the mass of one mole, and dividing the mass of your stuff by that tells you how many moles, without any other conversion factor. If we used a different unit of mass than grams, such as pounds, then it would make sense to use a similarly different definition of the mole to avoid throwing extra units into the equation.
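A minimal sketch of that bookkeeping in Python (illustrative atomic weights; the helper names are mine):

    # Approximate standard atomic weights, in g/mol.
    ATOMIC_WEIGHT = {"H": 1.008, "C": 12.011, "O": 15.999}

    def molar_mass(formula):
        """Mass of one mole in grams, from element counts."""
        return sum(ATOMIC_WEIGHT[el] * n for el, n in formula.items())

    def moles(sample_grams, formula):
        """Moles in a sample: mass divided by molar mass, no other factors."""
        return sample_grams / molar_mass(formula)

    water = {"H": 2, "O": 1}            # H2O
    print(molar_mass(water))            # ~18.015 g/mol
    print(moles(100.0, water))          # ~5.55 mol in 100 g of water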
The difficulty with this simplistic view is that it turns out that nucleons actually differ in mass slightly, depending on which nucleus they're in (and, also, isotopic ratios matter), which means there are a few different definitions you can use for the number of nucleons in 1 gram of stuff. In practice, the difference is small enough that it doesn't matter for most uses, especially if you build it into your table of atomic weights.
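A concrete instance of that caveat, with the standard helium-4 numbers (quoted from memory, so treat as approximate):

    2 free protons + 2 free neutrons ~ 2(1.00728 u) + 2(1.00866 u) ~ 4.03188 u
    one He-4 nucleus                 ~ 4.00151 u
    deficit ~ 0.0304 u ~ 28.3 MeV of binding energy, about 0.75% of the mass

So a nucleon bound in helium "weighs" about 0.75% less than a free one, which is exactly the kind of discrepancy the atomic-weight tables have to absorb.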
You were also never going to test your scale with a weight that is stored in a vault in Paris. Until now you would test, in the best case, with a replica of a replica of that weight (getting a copy directly from the local standards body). With the new definition, in principle anybody can create an object that weighs exactly one kg and sell it to you.
Will this increase the probability of adoption of the SI system? I have often had the impression that some nations avoided transitioning to SI on principle, not wanting to depend on the maintenance and cooperation of another nation.
As part of this SI redefinition, the Boltzmann constant is also getting an exact definition, 1.380649x10^-23 J/K (thereby defining the kelvin).
If we identify the entropy formula
S = kB ln W
as really being about bits, we see that in natural units kB = 1/ln(2) and can eliminate all energies in favor of temperatures (or vice-versa). If you think that entropy is better measured in nats, kB = 1.
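Spelling out that unit bookkeeping:

    S = kB ln W = (kB ln 2) * log2(W)

so one bit of entropy corresponds to kB ln 2 ~ 9.57x10^-24 J/K. Measure entropy in bits and temperature in energy units, and the leftover constant is 1/ln(2); measure entropy in nats and it is exactly 1.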
The SI units are supposed to be used to define all other units.
The mole is just a unitless number which isn't used for many definitions, while the bit (or preferably, the shannon) has a fundamental use in information theory and the wider field of physics.
The mole is a human-specific number; the bit is a property of the universe. Information is a physical quantity, just like mass or length, and deserves to have a unit in the SI system.
You could use other bases, like the nat (natural log) or the hartley (base 10), but the bit is most elegant as it uses the smallest integer base possible, 2.
But if we assume you want to avoid the normalization, then it would seem more sensible, for numerical efficiency, to use balanced ternary. Contrary to what people presume when they rather cluelessly apply the literal textbook (and terrible) metric of 'radix efficiency' (an entirely artificial measure, originally invented to account for engineering concerns related to vacuum tube spacing), balanced ternary actually has higher numeric efficiency than unsigned ternary, as shown by a then Indian computer science student, Abhijit Bhattacharjee, whose work on this unfortunately only remains accessible via the Internet Archive Wayback Machine:
At some point he got a reply from Marvin Minsky (to an email he had sent him). It's also available via archive.org, but only with a lot of trickery I can't do from my mobile phone; I've previously verified the quote he provided at the bottom of the page, and by everything I could check it does indeed seem genuine. I'll quote it here:
"Yes it is a fascinating subject, and your explorations and descriptions are extremely readable. I hope it gets the interest of more mathematicians."
I've always been mildly irritated that the name of the SI mass base unit has a prefix. Second, liter, meter, newton, kilogram. Were I emperor of the world, I would slide the name scale up so that the mass equal to 2.205 lbs was called a gram.
If you were emperor of the world, I would implore you to finance a multibillion-dollar project to measure the gravitational constant to higher precision so that we could move to units that actually make sense.
Came here to mention the Planck units, aka God's units :)
Sadly they are a little outside the human realm of experience, both in the size of the individual units (waaay too small, usually) and in the quantities the list covers (speed, electrical charge, etc., where we prefer "introductory" quantities such as length, time, and weight). But these types of decisions should probably be left for whenever people can economically measure the constants with enough precision to be serious about switching.
> Aside, are the mega and giga prefixes ever used for anything except bits and bytes?
Sure, a few examples off the top of my head:
- Megatons, as in the yield of nuclear weapons. Which is a bit of an odd unit; it should be in J, goddammit!
- Similarly, energy consumption/production. Commonly used units: MWh, GWh, TWh. Again, the proper unit would be J, not Wh. Or Mtoe, for million tons of oil equivalent. (Rough conversions to J sketched below.)
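A sketch of those conversions, using the conventional factors (1 t TNT = 4.184 GJ, 1 toe = 41.868 GJ):

    # Everything in joules, the unit the parent wishes we used.
    J_PER_MT_TNT = 4.184e15   # 1 megaton of TNT equivalent
    J_PER_TWH    = 3.6e15     # 1 Wh = 3600 J
    J_PER_MTOE   = 4.1868e16  # 1 million tonnes of oil equivalent

    print(f"1 TWh  ~ {J_PER_TWH / J_PER_MT_TNT:.2f} Mt TNT")   # ~0.86
    print(f"1 Mtoe ~ {J_PER_MTOE / J_PER_TWH:.1f} TWh")        # ~11.6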
Yes! Aside from the extremely common megahertz and gigahertz, commonly used units are megajoules, gigajoules, megawatts, gigawatts, terawatts, petajoules (when countries are negotiating about energy security), exajoules (when people measure world energy consumption), megohms, gigohms, megalitres (for the capacity of a dam, though teralitres would have been more appropriate), megapascals (of some kind of tensile strength typically), gigapascals (of Young's modulus typically), megaergs (back when people still used ergs), and so on.
Megadeth is named after the term “megadeath”, a unit in Cold War calculations of possible nuclear attacks to designate a million deaths. The 80’s were a grim time. (You can actually see in the film Dr. Strangelove a scene where George C. Scott’s character leans over a binder labeled “World Targets in Megadeaths”.)
Such a choice might get you removed as emperor of the world -- the number of resulting engineering, scientific, and medical errors might be high. Generations of scientists would remember such an emperor, but not in a positive light.
I share your consternation with the kilogram's name, but it is almost as fundamental to our units as Franklin's implicit, and perhaps unfortunate, choice to call the charge of the electron negative.
Furthermore, the pound is a unit of force. One can find locations (poles and equator) on Earth's surface where the measured weights of an object can differ by at least 0.5% [1], making any definition of mass based upon weight troublesome.
The gram was the original base unit in the French revolutionary metric system. It was defined as the weight of 1 cm^3 of water at the temperature of melting ice. The definition was too imprecise and the kilogram turned out to be more practical, but renaming the unit would have been too much trouble I guess.
Reading about the standard kilogram always reminds me of the 2014 movie "1001 Grams" [0]. Not sure how accurately it reflects how things are actually run there, but it was interesting to watch nevertheless.
Confused, so how do they measure 1 kilo of something now when they need to calibrate the most precise instruments? They count 10^40 photons? How? And how does that translate into a physical mass they want to carry around?
Edit: Seems I completely missed a paragraph somehow. Thanks!
The term of art is "realisation". The article mentions two realisations but does not elaborate: "In practice, there are currently two known methods for measuring such masses with great precision. These are known as the Kibble balance and the single-crystal silicon sphere". You should be able to google from that.
I never understood why a sphere was even in the discussion of a standardized atomic count.
Any fraction of a sphere's volume drags pi along with it, which would mean pi's worth of atoms or molecules, and it's absurd to have r = (3V/4pi)^(1/3). No matter how you mess with it, it's irrational to the extreme!
Now... face-centered or body-centered cubic follows a nice integer expansion for counting. We don't need no steenkin' pi's!
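Concretely, with the standard cell contents (textbook crystallography, not anything specific to the Avogadro project):

    BCC: 8 corners x 1/8 + 1 body atom   = 2 atoms/cell -> 2n^3 atoms in an n x n x n block
    FCC: 8 corners x 1/8 + 6 faces x 1/2 = 4 atoms/cell -> 4n^3 atoms in an n x n x n block

No pi anywhere in the count.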
For most intents and purposes it's OK if you are a bit inaccurate. The important thing is that we will never drift further and further away. In 1000 years' time we will still have exactly the same kg as we have now.
Transitioning is quite interesting: you have to measure your old unit with the new system at a resolution never reached before, so that you know you don't invalidate a large collection of numbers everywhere in the world. Then, depending on your definition, you might choose a special number; Avogadro's number has to be a perfect cube, numbers of transitions have to be integers, etc.
>Despite the greatest of precautions, every time the standard kilo was handled — for example, to compare it to another unit that could then be used to calibrate instruments — it would shed some atoms and its mass would be slightly changed. Over its lifetime, that standard kilo is estimated to have lost about 50 micrograms.
Why did the standard kilo ever need to be handled? Just store it on a balance and only handle the comparison unit.
The standard kilo was/is stored in a vault inside several bell jars and only removed once every ~30 years to compare with the world's national reference/transfer standards.
As far as I know, the mechanism by which Le Grand K is drifting with respect to all of the national references is not known.
To leave it on a balance (which must be operated/checked/maintained) would expose the K to much more risk.
At this precision you weigh things twice: once with the reference on the left arm of the balance and the test object on the right, and another time with the objects swapped.
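This is the classic transposition method (often attributed to Gauss); spelling out why the swap helps, with arm lengths l1 and l2 (not assumed equal):

    first weighing:  M*l1 = w1*l2  ->  w1 = M*(l1/l2)
    after swapping:  M*l2 = w2*l1  ->  w2 = M*(l2/l1)
    hence            M = sqrt(w1*w2) ~ (w1 + w2)/2

The unknown arm ratio cancels exactly in the geometric mean, and the arithmetic mean is good to second order in the arm mismatch.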
This article sounds like it got written by a scientist but got butchered through a PR department.
It talks about the "Mass of a photon" and weighing them directly, rather than the energy-equivalent mass, which is presumably what they are intending to talk about?
I know that zero is a finite number, but since the photon is a massless particle, wouldn't it be impossible to get a kilogram worth of photons bouncing around in an optical cavity?
No, it’s entirely possible in principle. If you build a perfect optical cavity, weigh it, pump a bunch of light in, and weigh it again, you’ll find that the weight went up by ghf/c^2 summed over each photon added to the cavity. Divide by g and you get the sum of the masses of the photons. 1 kg of photons would be enough to destroy your lab and the rest of your city, so only a very advanced civilization would ever do the experiment on this scale.
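To put a number on "destroy your lab and the rest of your city":

    E = m*c^2 = 1 kg x (2.998x10^8 m/s)^2 ~ 9.0x10^16 J
    9.0x10^16 J / 4.184x10^15 J per Mt ~ 21 Mt of TNT equivalent

i.e., on the order of a large thermonuclear weapon, confined to an optical cavity.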
Photons indeed have no rest mass, but it’s impossible to ever find a photon at rest unless you yourself have no rest mass, in which case you also have no ability to do experiments.
If you had a perfectly reflective impervious box, put in 500g of matter and 500g of antimatter, and shook it, the box would then contain 1000g of gamma ray photons. Same gravitational and inertial mass as 1000g of inert matter. (Due to gravitational blueshifting, the photons hit the bottom of the box slightly harder than the top. This is not a halfbaked science blooper; your local physicist will tell you this is actually correct.) QM says photons have zero 'rest mass', but that's not the same thing as actual zero mass.
I wonder if there is a process that has to be gone through to now update the pound avoirdupois. It is defined as 453.59237 grams but given the formal definition of the gram has changed could there be a need for a formal acknowledgement, or does the law encompass changes in methods so it just happens automagically?
Yeah, jokes aside, I think _technically_ the post title should be changed. (Even the article's title is wrong. Though it wouldn't be the first time HN's title is more accurate than that of the article it points to.)
But here is the answer [1]: "The new definition only became possible when instruments were devised to measure the Planck constant with sufficient accuracy based on the IPK definition of the kilogram."
The new reference kilogram is an engineering marvel, but I’ve always wondered this: why didn’t they just build a reference gram instead? Seems like that would be at least a thousand times easier to pull off.
As covered in the article, the reference object loses some of its atoms every time it is handled. The smaller the reference object, the larger the relative error induced this way.
Not just that: it's probably just the surface (or near-surface) that sublimes. The surface only increases with the square of the linear size while the volume increases with the cube, so multiplying the volume by 27 only multiplies the surface by 9.
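Put differently, if the loss scales with surface area, then for geometrically similar objects:

    area scales as m^(2/3), so relative loss scales as m^(2/3) / m = m^(-1/3)

A 1 g copy would then lose roughly 1000^(1/3) = 10 times as large a fraction of its mass as a 1 kg artifact of the same shape (assuming the loss really is surface-dominated).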
It was easier and more reliable to manufacture the reference kilo in the past. With the new redefinition, you can make your own reference gram, nanogram, megagram, etc.
Your intuition is correct -- it is possible to build more-precise intercomparison tools at the gram scale than the kilogram scale, but until next Monday, our units are referenced to Le Grand K, and any gram-reference must subdivide the absolute standard. After Monday, you're free to build your own gram-scale Kibble balance and call it an absolute standard.
The reference kilogram is estimated to have lost 50 micrograms over its lifetime. I think it's safe to say that a reference gram would have lost a similar amount of mass. This would have caused a larger change in the definition of a gram/kilogram over time, because the ratio of mass lost in the reference would have been greater.
It's hard to say because the mechanisms for the mass loss are not fully known. In fact, we are just guessing that mass loss actually occurred, and how much did occur. How would we have measured the mass change of the reference kilogram? By definition the reference kilogram was the most stable quantity of mass we could have generated at the time, and the most well-protected quantity of mass for the duration of its service as the reference kilogram.
I think mass loss is probably proportional to either the surface area or the surface area of the handled portion. This is just a hypothesis based on what I suspect the mechanism for mass loss is (friction from air resistance and being touched). Now that we have a more accurate way of measuring mass, we could repeat the experiment of trying to preserve a quantity of mass, even doing different variations based on quantity, material, conditions, etc. Such an experiment would be relatively expensive and take a long time for good results, but maybe someone curious with funding will do it.
EDIT: The idea that we could have preserved a gram of material, while measuring its mass every 40 years, for the last 130 years while it only lost 50 nanograms (1/1000th of the estimated mass the reference kilogram lost) is so crazy I can't believe it.
Basically, it's part of the lie of the mythology of the metric system.
The metric system did some great things (eliminated the use of measures that depended on the substance being measured, dry pints vs wet pints, bushel of wheat versus bushel of oats; simple relationship between units of length, area and volume), some things that were already common at the time (eliminated regional definitions of units; related the volume and mass of water), and some stupid things (metric prefixes).
But at the end of the day, the size of the meter and kilogram were chosen to be very nearly 3 Parisian feet and 2 Parisian pounds, because that made it easier to adopt.
The answer to the question "why is the unit weight the kilogram instead of the gram" is "because the kilogram is about 2 Parisian pounds, which is what everyone at the time actually cared about".
As a scientist, use scientific notation with the base units; you eliminate opportunities for confusion. The prefixes don't really add anything.
The Sun is 1.5 x 10^11 meters from the Earth.
Calling it 1.5 x 10^8 kilometers doesn't help you visualize the distance better, nor calling it 150 gigameters. Now you have three numbers that mean the same thing floating around, and if you accidentally write 1.5 x 10^8 m somewhere instead of km, or read 1.5 x 10^11 km as m, you've just introduced a thousand-fold error.
Common units (e.g., the angstrom) make sense when you do not need to convert between them; it's convenient if they are easily convertible to your standard unit. But having mg and ug and g and kg floating around is just an unnecessary headache: you inevitably interpret a microgram dose as a milligram dose from time to time and poison a patient or eight.
Personally, I find the prefixes helpful when memorizing approximate values, e.g. for back-of-the-envelope estimates (proton mass is 1 GeV, electron mass is 0.5 MeV, room temperature is 25 meV, h_bar c is 197 MeV fm, ...)
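Sanity-checking the room-temperature one with the newly exact kB:

    kB * 300 K = 1.380649x10^-23 J/K x 300 K ~ 4.14x10^-21 J ~ 25.9 meV

(using 1 eV = 1.602x10^-19 J).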
A valid defense of the metric prefixes (I mean, I still think they're on balance a mistake, but you do have a point), but ironically, electron-volts are not an SI unit.
Or if you need to write it down anywhere at all; that's why programmers use eN or e+N. Although we really need to get around to standardizing a hexadecimal equivalent.