 i like this article from november last year, in the run-up to the ITU world radiocommunication conference, about the increasingly firm plans to abolish the leap second

https://web.archive.org/web/20231103043101/https://www.nytimes.com/2023/11/03/science/time-leap-second.html

however the ITU-R WRC-23 didn't agree on the "leap minute" proposal; instead they decided that "the maximum value for the difference between UT1 and UTC should be no less than 100 seconds"

https://www.itu.int/pub/R-ACT-WRC.15-2023 pp. 398–400
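
for a sense of how much headroom 100 seconds buys, here’s my own back-of-envelope (with a big assumption: that UT1 drifts at its 1972–2016 average rate, when in fact the earth’s rotation wanders and has recently sped up):

```python
# back-of-envelope: how long for |UT1 - UTC| to reach 100 seconds?
leap_seconds = 27  # leap seconds inserted between 1972 and 2016
drift_per_year = leap_seconds / (2016 - 1972)  # ~0.61 s/year average
print(f"~{100 / drift_per_year:.0f} years to accumulate 100 s")  # ~163
```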
 unrelated to leap seconds, more about time and frequency metrology in general: the BIPM is carefully working towards a new definition of the second based on optical frequency standards instead of caesium

https://www.bipm.org/en/-/2024-02-20-redefinition-second

the plan is to use a quantum transition that produces photons at a higher “optical” frequency than caesium’s 9.2 GHz, which should make it possible to improve the precision of atomic clocks by a factor of 100 or more
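
for scale, here’s my own comparison using the ⁸⁷Sr lattice clock as a representative optical standard (the choice of Sr here is mine; its frequency is the CIPM recommended value, rounded):

```python
# rough comparison of "tick rates": caesium hyperfine transition vs
# one of the optical candidates (87Sr lattice clock)
CS_HZ = 9_192_631_770        # Cs-133 hyperfine frequency, exact by definition
SR_HZ = 429_228_004_229_873  # 87Sr clock transition, approximate
print(f"Sr is ~{SR_HZ / CS_HZ:,.0f}x the Cs frequency")  # ~46,694x
```

a finer tick means the same absolute frequency noise is a much smaller fractional error, which is where most of the precision win comes from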
 the definition of the second is foundational to the (new) SI since almost all the other units depend on it

so making the practical realization of the second more precise can make the other units more precise too
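
the metre is the clearest example (standard SI definitions, nothing new from me here): it is pinned to the second through the fixed speed of light, so a better second directly tightens the metre

```python
# the metre is defined from the second via the exact speed of light:
# 1 m = the distance light travels in vacuum in 1/299_792_458 s,
# so any improvement in realizing the second flows into the metre
C = 299_792_458  # m/s, exact by definition in the SI
print(f"1 m of light travel takes {1 / C:.6e} s")  # ~3.335641e-09 s
```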

but there isn’t a clear winner among the optical clocks the way caesium was 70 years ago, so the metrologists are taking this change more slowly and carefully
 atomic clocks are incredibly precise

they are currently at 10^-16 fractional accuracy, which is about the same as the relative precision of double precision floating point
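
that comparison checks out; a quick sanity check of my own, nothing clever:

```python
import sys

# machine epsilon: the relative spacing of IEEE-754 doubles near 1.0
print(sys.float_info.epsilon)  # 2.220446049250313e-16
# same ballpark as the ~1e-16 fractional accuracy of caesium fountains
```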
 there’s an interesting (more radical) suggestion in this Metrologia paper to make the second more like the other new SI units

“fixing the numerical value of one more fundamental constant, in addition to c, h and e. From the fundamental standpoint, a good choice for this constant is the electron mass m_e, in which case the system of units is set by the relations:

m_e = M kg,

where M is the defining value, completed by the other defining relations for c, h, e, k_B, N_A and K_cd.” 
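
to spell out the mechanism as i understand it (my gloss, not a quote from the paper): with c and h already fixed, fixing m_e turns the electron’s Compton frequency m_e c²/h into a defined frequency, i.e. a built-in tick rate for the unit of time; in practice the realization would have to go through measurable quantities like the Rydberg constant, which is where the trouble comes in

```python
# my gloss on option 3: with c and h fixed, a defined m_e makes the
# electron's compton frequency nu = m_e * c**2 / h a defined frequency
M_E = 9.1093837015e-31  # kg -- CODATA value, playing the "defining" role here
C = 299_792_458         # m/s, exact in the SI
H = 6.62607015e-34      # J s, exact in the SI
print(f"compton frequency: {M_E * C**2 / H:.4e} Hz")  # ~1.2356e+20 Hz
```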
 later on they say “Currently, the value of m_e has an uncertainty of 3.0 parts in 10^10, while the uncertainty in the Rydberg constant is 1.9 parts in 10^12. These uncertainties are several orders of magnitude larger than the present realizations of the unit of time of the current SI system (few parts in 10^16) and even further away from the capabilities of optical frequency standards (10^−18 or better). Consequently, Option 3 is not practical in the current state of science and technology.”
 i am imagining these metrologists saying, “oh it would be *so* elegant if we could define it this way! sadly, that would be a million times worse…” 
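
the quip is about right, taking “few parts in 10^16” as ~3 parts (my reading of “few”):

```python
# checking "a million times worse" against the paper's numbers
m_e_uncertainty = 3.0e-10  # fractional uncertainty of m_e, from the quote
cs_realization = 3e-16     # "few parts in 10^16", taking "few" as 3
print(f"{m_e_uncertainty / cs_realization:.0e}x worse")  # 1e+06 -- a million
```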
 there’s a table of candidate optical standards, ranging from 0.4 to 1.1 PHz, using Hg or Al or Yb or Sr or Ca, in trapped-ion optical clocks or neutral-atom optical lattice clocks

it says 4 labs have Yb lattices and 3 have Sr lattices in production, contributing to TAI

then one lab using oldskool rubidium, i’m pretty sure that’s the USNO’s rubidium fountains, which are unusual in (a) using rubidium instead of caesium and (b) running continuously as operational clocks rather than intermittently to calibrate other clocks
 seriously high-quality jargon

 “In addition to continued advances in cavity performance mentioned earlier, there are efforts in parallel to develop novel measurement protocols that mitigate the limitations caused by reference cavity noise, such as”

get this

“zero-dead time interrogation, correlation spectroscopy, and dynamic decoupling of laser phase noise in compound atomic clocks.” 
 something something retro-encabulator