The current definition of a second is stated here, and I found a presentation on the BIPM site which discusses plans to change to a "better" definition of the second. You can find the presentation here. The plan is to use a new definition based on "an optical transition". In what way does the current definition fall short? The BIPM presentation tries to explain why we need a new definition, but I don't have the background to understand it.
Answer
As a rule of thumb, the relative stability and precision you can hope to achieve with any oscillator are limited by the number of periods over which you can observe your system. For the current definition of a second the oscillator is the caesium hyperfine transition, a microwave transition at about $9\textrm{ GHz}\approx 10^{10} \textrm{ Hz}$. Since trapping the atoms shifts the energy levels, you have to toss them upwards in a fountain and interrogate them as they fall back down, which means that the effective interaction time is of the order of seconds.
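To put a rough number on that limit (a back-of-the-envelope sketch using order-of-magnitude figures, not the exact caesium values), the number of periods sampled in a single interrogation is about
$$N_{\textrm{microwave}} \approx f\,T \approx 10^{10}\textrm{ Hz} \times 1\textrm{ s} = 10^{10},$$
and by the rule of thumb above the fractional frequency resolution of a single measurement scales roughly as $1/N$.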
On the other hand, using a transition in the optical part of the spectrum would keep observation times of about the same order but increase the frequency of the radiation to about $10^{15}\textrm{ Hz}$. As Gill points out, this would mean uncertainties lower by two or three orders of magnitude, simply because you observe far more periods.
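Repeating the same back-of-the-envelope count for an optical transition (again order-of-magnitude figures only),
$$N_{\textrm{optical}} \approx 10^{15}\textrm{ Hz} \times 1\textrm{ s} = 10^{15}, \qquad \frac{N_{\textrm{optical}}}{N_{\textrm{microwave}}} \approx 10^{5}.$$
The naive period-counting argument would allow up to five orders of magnitude; the two or three orders quoted in the presentation are the more conservative figure, presumably because other noise sources and systematic shifts also matter in practice.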
The definition of a second is fine as it is for what we're doing now. However, cold-atom fountain clock technology is indeed close to this fundamental limit. As the presentation shows (page 3), optical clocks have overcome many of the technical problems that made them difficult to work with, as well as some fundamental ones solved by the frequency comb, and have caught up with fountain clocks. It is therefore time to ask whether we shouldn't make optical transitions the fundamental standard and stop worrying about calibrating them against a (less accurate) fountain clock.