New Atomic Clock 1000 Times More Accurate 313
stevelinton writes "The UK National Physical Laboratory has a new atomic clock potentially 1000 times more accurate than current cesium clocks: to within 1 second in about 30 billion years!
This could lead quite soon to a new definition of the second, and eventually to improved resolution in GPS successor systems. More interestingly, there are theories that some of the universe's fundamental dimensionless constants may have changed by a few parts in a million over the last 10 billion years or so. These clocks are so accurate that they should be able to detect such changes over a year or two."
Re:Why do this? (Score:4, Informative)
Re:Why do this? (Score:5, Informative)
There are also applications in scientific research -- I mentioned detecting changes in fundamental constants in the story; it might also help enable very long baseline interferometry (where two radio telescopes thousands of miles apart obtain the same resolution as one telescope thousands of miles wide) at higher frequencies, pushing into the far IR.
Re:Wrist Watch? (Score:2, Informative)
My favorite quote is "Batteries are included (they last about 45 minutes but are rechargeable)."
Re:Why go any further (Score:5, Informative)
Not really new (Score:5, Informative)
Changes in Constants? (Score:4, Informative)
Does anyone know more about this?
Re:Great! (Score:5, Informative)
Accuracy is how close a measurement is to the true value; precision is how closely repeated measurements agree with each other.
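The distinction can be sketched numerically (a toy Python illustration with made-up readings, not real clock data):

```python
# Toy illustration: a precise-but-inaccurate clock vs. an accurate one.
# The true offset from reference time is 0.0 s; readings are hypothetical.
precise_clock = [5.01, 5.02, 5.00, 5.01]   # tightly clustered, but ~5 s off
accurate_clock = [0.3, -0.2, 0.1, -0.1]    # scattered, but centred on 0

def mean(xs):
    return sum(xs) / len(xs)

def spread(xs):
    """Worst-case deviation of a reading from the set's own mean."""
    m = mean(xs)
    return max(abs(x - m) for x in xs)

print(mean(precise_clock), spread(precise_clock))    # far from 0, tiny spread
print(mean(accurate_clock), spread(accurate_clock))  # near 0, larger spread
```

The first clock is precise (readings agree with each other) but inaccurate; the second is the reverse.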
Showing the wrong time, no matter how precise, doesn't mean much. The new clock is more accurate.
Bad reporting (Score:5, Informative)
It's not 1000 times more accurate, it's 3 times more accurate (than NIST's mercury-ion resonator). The figure of 1000 is what they think the technology could reach in the future, but that's purely hypothetical.
NPL's errors -
Bombarding an ion with a blue laser in order to cool it is _in_no_way_ similar to firing a beam of light at a mirror-ball. Mirror balls do not get cooler when you fire beams of light at them. Explanations that use inappropriate analogies are as useful as wearing tie-dyed lab-coats in night-clubs.
If "one part in 10^18" is "nearly a thousand times more accurate than the best clocks of today", then today's best clocks must be accurate to 1 part in 10^15. Therefore this new clock, being "three times more accurate than the Americans", "3.4 parts in 10^15", cannot be the be the best clock of today. Either that or someone in NPL can't do simple maths.
Second Minute (Score:5, Informative)
I thought that was a neat and strange word origin (if correct).
to quote him...
"When they came to require still smaller subdivisions of time, they divided each minute into 60 still smaller parts, which, in Queen Elizabeth's days, they called "second minutes" (i.e. small quantities of the second order of minuteness). Nowadays we call these small quantities of the second order of smallness "seconds"."
Re:Accurate clocks causing us problems (Score:4, Informative)
Unfortunately the world has not completely standardized on when and how these leaps seconds are to be inserted
Rubbish. This has been standardised [navy.mil] for many years.
Re:this might be a stupid question but... (Score:5, Informative)
For that matter, if the talk I heard a year ago about the work at NIST on this very thing is still true, these atomic clocks can't maintain their accuracy for more than a week or so.
The "one second in 30 billion years" is a convenient extrapolation so that non-scientific persons get an idea of how accurate it is. It would be more correct to say that the atomic clock, in situations of normal operation, is accurate to one part in 10^18.
For that matter, it doesn't hold a wall-clock type value, like saying it's exactly 22:04:17.832... Our choice of reference for time (say, when "noon" is), is difficult to measure and quite arbitrary. Instead, you're interested in, say, how long a particular process takes (light making a round trip, or atomic decay), measured to a very high degree of accuracy (and precision).
Of course units of time are arbitrary. All units are arbitrary. Dimensions (length, time, etc.) and fundamental constants are non-arbitrary, but don't have any "natural" expression in terms of the units we use. (The most natural system of units is arguably expressing everything in terms of fundamental constants.) Seconds, minutes, hours, and years have arbitrary definitions for our convenience, just like any other unit.
Re:Second Minute (Score:3, Informative)
a. F. seconde, ad. med.L. secunda, fem. of L. secundus SECOND a., used ellipt. for secunda minuta, lit. 'second minute', i.e. the result of the second operation of sexagesimal division; the result of the first such operation (now called 'minute' simply) being the 'first' or 'prime minute' or 'prime' (see PRIME n.2 2)
Re:Accurate distance too? (Score:1, Informative)
If this new atomic clock can be used to tune laser diodes even better than before, then this should improve distance measurement accuracy.
Better atomic clocks could probably also improve "time of flight" distance measurements. "Time of flight" means the time taken for a brief pulse of light to exit the source, reflect off the target, and reach the source again. Do the math and you get the distance.
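The math is a one-liner (a sketch; the function name is just illustrative):

```python
# Time-of-flight ranging: distance from the round-trip time of a light pulse.
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_seconds: float) -> float:
    """One-way distance: the pulse covers the path twice, so halve the trip."""
    return C * round_trip_seconds / 2

# A 1 ns timing error maps to ~15 cm of distance error, which is why better
# clocks mean better ranging.
print(tof_distance(1e-9))  # ~0.15 m
```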
This could also be used to triangulate an RF emission source using three towers, but it's not like that can't be done with sufficient accuracy today using existing atomic clocks, or even simply with directional antennas.
Re:Second Minute (Score:2, Informative)
second (n.)
"one-sixtieth of a minute," 1391, from O.Fr. seconde, from M.L. secunda, short for secunda pars minuta "second diminished part," the result of the second division of the hour by sixty (the first being the "prime minute," now called the minute), from L. secunda, fem. of secundus (see second (adj.)). Shortened form sec first recorded 1860.
So sort of true, but of course the use of second and minute as time units originates in Latin.
Re:Second Minute (Score:4, Informative)
According to multiple sources (see Eli Maor, Trigonometric Delights, Princeton Press, etc):
"The Greeks called the sixtieth part of a degree the "first part," the sixtieth part of that the "second part,"...
In Latin the former was called pars minuta prima ("first small part") and the latter pars minuta secunda ("second small part"),
from which came our minute and second."
The actual subdivisions are Babylonian in origin, since they invented the concept of the 24-hour day
with sexagesimal units of time (hours), which were subdivided a SECOND time into 60 TINIER chunks (seconds).
Notice also that most romance languages have words for this unit of time that not only predate Queen Elizabeth's birth, but the English language itself.
Re:Give or take a year... (Score:3, Informative)
Fundamental constants of Nature changing over the Universe history and/or over space is a topic of debate in the physics community (in which I include myself, being a grad student in physics).
There is no compelling theoretical reason that suggests this running of the fundamental constants. There is some experimental (astrophysical) evidence that could be explained in this way, and several models have been developed. They would have far-reaching consequences, changing our views on cosmology and the Standard Model of particle physics.
Pinkfud, your "simple" argument is a trivialization of the issue and doesn't really make sense. For example, no observer could stand outside the universe, because there's nothing outside the universe.
I don't know if you got your ideas from a "popularising" science magazine (don't ever trust them) or misinterpreted a more serious source. But keep researching it, and if you get the opportunity to discuss it with a physicist, do it.
P.S.: I intend this comment only to point out that your understanding is incorrect and to encourage your interest in physics.
Re:Second Minute (Score:1, Informative)
"Second" comes directly from Latin and has found its way into Germanic and Slavic languages via that etymology.
To disprove that in two seconds: Other languages use the exact same word from the same source. And they did so long before Queen Elizabeth.
It found its way into English in 1391 (first occurrence) from Old French as 'seconde'. The rest, however, is pretty accurate. The big minutes were called "prime minute" and the small ones "second minute", even in French.
Don't mod up anything that sounds smart, people. Mods...
Re:Accurate distance too? (Score:1, Informative)
Re:Great! (Score:1, Informative)
So what is accuracy in this context? Primary frequency standards based upon cesium attempt to realize the definition of the second. They do so by struggling to reduce uncertainty. So, when you build a primary frequency standard, you toss a cable out into the room and say, "Here's a signal with frequency X plus or minus Y."
Some folks call the Y the accuracy, but it's really the uncertainty.
As far as an optical frequency standard goes, there is no definition to stand upon regarding the frequency it produces. However, you can take a hard look at all known sources of error and state their root-sum-square as the uncertainty of the frequency it produces.
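The root-sum-square combination looks like this (with hypothetical error budgets, not anyone's actual numbers):

```python
import math

# Root-sum-square combination of independent error sources.
# Hypothetical per-source frequency uncertainties, in Hz:
error_sources_hz = [0.5, 0.3, 0.2]  # e.g. magnetic shift, blackbody shift, ...

uncertainty_hz = math.sqrt(sum(e**2 for e in error_sources_hz))
print(uncertainty_hz)  # ~0.616 Hz: dominated by the largest single source
```

Note that the total is barely larger than the biggest contributor, which is why frequency-standard work is a game of hunting down the dominant error term.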
Think about that watch described above, and think of its ticking as a frequency standard and you'll see that if it is neither fast nor slow then it has good accuracy, regardless of how the "ticks are labeled."
Re:So, can someone please tell me... (Score:3, Informative)
The standard press description is a little confusing. A good way to think about the subject is that atomic clocks are extremely good frequency standards, which incidentally makes them good time standards as well (if I have a pendulum that oscillates once per second, I can measure time by counting the number of oscillations).
The idea behind all atomic clocks is that atoms are very picky about the kinds of light they absorb and emit (that's how astronomers can tell what kinds of atoms make up stars). There are some frequencies of light that interact very strongly with any given kind of atom, and some frequencies where the light barely interacts at all. When the atom absorbs a photon, it jumps to a higher energy state, when it emits a photon, it jumps to a lower energy state.
If you look carefully at the spectrum of light that an atom absorbs or emits, you'll find that the atom isn't equally picky about every kind of transition that it can make. There are some transitions (in cesium, they're called hyperfine transitions) where the atom isn't just picky, it's positively fastidious. What you'll find is that if you want to excite these transitions, you'll have to shine light that is exactly the right frequency, plus or minus a tiny amount (the "linewidth" -- literally, if you plotted absorption vs. frequency, the width of the peak you would see on the graph.)
So reasoning backwards, if I'm shining a laser at a cavity of cesium atoms and I measure that they're strongly absorbing the light, then I know the frequency of the laser has to be *exactly* the frequency that excites the atom, plus or minus a tiny linewidth. So I can count the oscillations of my laser and figure out how much time has elapsed. That's basically how an atomic clock works.
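The counting step can be sketched in a few lines (the cesium frequency is exact by the SI definition of the second; the function name is illustrative):

```python
# Sketch: elapsed time from counting oscillations of a known-frequency reference.
# The cesium hyperfine frequency defines the SI second, so it is exact:
F_CS = 9_192_631_770  # Hz

def elapsed_seconds(cycles_counted: int) -> float:
    """Time elapsed while counting this many oscillations."""
    return cycles_counted / F_CS

print(elapsed_seconds(9_192_631_770))  # exactly 1.0 second, by definition
```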
But if you wanted to get really anal about it, you could point out that I really don't know the exact frequency of my laser at all -- all I know is that it's the frequency of the atomic transition, plus or minus the linewidth. So if a hypothetical Alice and Bob in adjacent laboratories had lasers locked to the same transition, it's possible that Alice could have her laser locked at the atomic transition frequency minus a linewidth, while Bob has his locked at the atomic transition frequency plus a linewidth. (The linewidth was chosen to be really small, but it's still not zero: also, really narrow lines are hard to lock a laser to, so there's always a tradeoff involved.) So Alice and Bob's clocks will drift a tiny bit relative to each other. But because the linewidth is so small, it will take an insane number of oscillations before Bob measures that one more second has passed than Alice measures. The "1 second in 30 billion years" is just a reflection of this: it measures the linewidth of the transition relative to the frequency of light involved.
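The closing remark can be made concrete (illustrative round numbers, not figures from the article):

```python
# Illustrative scale of "linewidth relative to frequency":
optical_frequency_hz = 1e15   # an optical transition, ~10^15 Hz (assumed)
linewidth_hz = 1e-3           # an extremely narrow line, ~1 mHz (assumed)

# Alice's and Bob's lasers can disagree by at most ~one linewidth, so their
# fractional rate difference is bounded by roughly linewidth / frequency:
fractional = linewidth_hz / optical_frequency_hz
print(fractional)  # 1e-18: the "1 part in 10^18" scale quoted for the clock
```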
The appeal of using Mercury or Strontium atoms (small world: one of my best friends is also working on a Strontium time standard) is that they have a special transition that is even narrower relative to the transition frequency than Cesium's hyperfine transition.
Re:How do they know? (Score:3, Informative)
Clocks can also be run in groups. With some mathematics, the group can produce a result that is more accurate than a single clock.
If you have a detailed knowledge of the physics involved in the operation of a clock, the possible sources of error can be modeled and predicted.
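The averaging idea can be simulated (a toy Python sketch with simulated readings; real clock ensembles use far more sophisticated weighting):

```python
import random

# Toy simulation of a clock ensemble: averaging N independent readings shrinks
# the random error by roughly sqrt(N). Simulated data, not real clock behaviour.
random.seed(1)
TRUE_TIME = 100.0   # "true" epoch, seconds (arbitrary)
SIGMA = 1e-3        # per-clock random error, seconds

def read_clock() -> float:
    return random.gauss(TRUE_TIME, SIGMA)

single_error = abs(read_clock() - TRUE_TIME)
ensemble = [read_clock() for _ in range(100)]
ensemble_error = abs(sum(ensemble) / len(ensemble) - TRUE_TIME)

print(single_error, ensemble_error)  # the ensemble mean is usually much closer
```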
Re:Compared to what? (Score:3, Informative)