New Atomic Clock 1000 Times More Accurate 313

stevelinton writes "The UK National Physical Laboratory has a new atomic clock potentially 1000 times more accurate than current cesium clocks: to within 1 second in about 30 billion years! This could lead quite soon to a new definition of the second, and in a while to improved resolution in GPS successor systems. More interestingly, there are theories that some of the universe's fundamental dimensionless constants may have changed by a few parts in a million over the last 10 billion years or so. These clocks are so accurate that they should be able to detect these changes over a year or two."
This discussion has been archived. No new comments can be posted.

  • Re:Why do this? (Score:4, Informative)

    by Lisandro ( 799651 ) on Saturday November 20, 2004 @01:43PM (#10875294)
    It won't be of any use to the regular Joe. But there's a lot of scientific experiments that rely on accurate time measurements, notably those involving relativistic effects.
  • Re:Why do this? (Score:5, Informative)

    by stevelinton ( 4044 ) <sal@dcs.st-and.ac.uk> on Saturday November 20, 2004 @01:47PM (#10875318) Homepage
    The accuracy of caesium clocks is one of the factors limiting GPS accuracy to a meter or so. These clocks could get that down to a millimeter allowing, for instance, GPS based automated guidance for trucks and automated landing for planes.

    There are also applications in scientific research -- I mentioned detecting changes in fundamental constants in the story. It might also help allow very long baseline interferometry (where two radio telescopes thousands of miles apart obtain the same resolution as one telescope thousands of miles wide) at higher frequencies, pushing into the long IR.
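    A back-of-the-envelope sketch of the timing-to-position relationship described above (the nanosecond and picosecond figures are my own illustration, not from the post):

```python
# GPS-style ranging: position error is roughly clock error times the speed of light.
C = 299_792_458.0  # speed of light, m/s

def position_error(clock_error_s):
    """Ranging error (metres) produced by a given clock error (seconds)."""
    return C * clock_error_s

print(position_error(3.3e-9))   # ~1 m: metre-level GPS needs ns-level timing
print(position_error(3.3e-12))  # ~1 mm: millimetre-level needs ps-level timing
```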
  • Re:Wrist Watch? (Score:2, Informative)

    by TeaQuaffer ( 809857 ) on Saturday November 20, 2004 @01:49PM (#10875325)
    The link of which you speak is here [leapsecond.com]

    My favorite quote is "Batteries are included (they last about 45 minutes but are rechargeable)."

  • by stevelinton ( 4044 ) <sal@dcs.st-and.ac.uk> on Saturday November 20, 2004 @01:52PM (#10875341) Homepage
    Because they're interested in deviations of much less than a second.
  • Not really new (Score:5, Informative)

    by Dolphinzilla ( 199489 ) on Saturday November 20, 2004 @01:56PM (#10875362) Journal
    Trapped-ion frequency standards are nothing new; NIST made one years ago. The main difference is that NPL uses strontium instead of mercury. While it appears to be more accurate than the NIST one, trapped-ion standards are not very practical to build or run for everyday use, and it's not a primary frequency standard: since the definition of the second is in terms of the cesium resonance, only cesium clocks are primary frequency standards.
  • by TeaQuaffer ( 809857 ) on Saturday November 20, 2004 @02:00PM (#10875380)
    There is a little blip by Chris Carilli [physicsweb.org] about changes in constants, and a more detailed article here [physicsweb.org].

    Does anyone know more about this?

  • Re:Great! (Score:5, Informative)

    by metlin ( 258108 ) * on Saturday November 20, 2004 @02:08PM (#10875436) Journal
    No, he was right.

    Accuracy is how close a measurement is to the actual value; precision is how closely repeated measurements agree with each other.

    Showing the wrong time, no matter how precise, doesn't mean much. The new clock is more accurate.
  • Bad reporting (Score:5, Informative)

    by fatphil ( 181876 ) on Saturday November 20, 2004 @02:12PM (#10875461) Homepage
    Slashdot's error -
    It's not 1000 times more accurate, it's 3 times more accurate (than NIST's mercury-ion resonator). The figure of 1000 is what they think the technology could eventually achieve, but that's purely hypothetical.

    NPL's errors -
    Bombarding an ion with a blue laser in order to cool it is _in_no_way_ similar to firing a beam of light at a mirror-ball. Mirror balls do not get cooler when you fire beams of light at them. Explanations that use inappropriate analogies are as useful as wearing tie-dyed lab-coats in night-clubs.

    If "one part in 10^18" is "nearly a thousand times more accurate than the best clocks of today", then today's best clocks must be accurate to 1 part in 10^15. Therefore this new clock, being "three times more accurate than the Americans", at "3.4 parts in 10^15", cannot be the best clock of today. Either that or someone at NPL can't do simple maths.

    FP.
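    The parent's arithmetic is easy to check with the figures as quoted (a sketch; the numbers come from the post, not from NPL directly):

```python
# Fractional uncertainties quoted in the post
future_optical = 1e-18    # "one part in 10^18" (hypothetical future figure)
nist_mercury   = 3.4e-15  # "3.4 parts in 10^15" (NIST mercury-ion standard)

# If today's best clock is the NIST standard, 1e-18 would be an improvement of:
print(nist_mercury / future_optical)  # 3400x, not "nearly a thousand" --
# which is the parent's point: the quoted claims use inconsistent baselines
```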
  • Second Minute (Score:5, Informative)

    by zenzic ( 804840 ) on Saturday November 20, 2004 @02:20PM (#10875507)
    According to Silvanus Thompson in his famous (and awesome!) calculus book (c. 1910), the word second comes from the term "second minute".

    I thought that was a neat and strange word origin (if correct).

    to quote him...
    "When they came to require still smaller subdivisions of time, they divided each minute into 60 still smaller parts, which, in Queen Elizabeth's days, they called "second minutes" (i.e. small quantities of the second order of minuteness). Nowadays we call these small quantities of the second order of smallness "seconds"."

  • by philip_bailey ( 50353 ) on Saturday November 20, 2004 @02:56PM (#10875688) Homepage

    Unfortunately the world has not completely standardized on when and how these leap seconds are to be inserted

    Rubbish. This has been standardised [navy.mil] for many years.

  • by blueg3 ( 192743 ) on Saturday November 20, 2004 @03:15PM (#10875791)
    It's an awful point. When you build atomic clocks, you're not interested in measuring how long it takes the earth to go around the sun to great precision. You're not interested in actually keeping time for the next 30 billion years accurate to a second.

    For that matter, if the talk I heard a year ago about the work at NIST on this very thing is still true, these atomic clocks can't maintain their accuracy for more than a week or so.

    The "one second in 30 billion years" is a convenient extrapolation so that non-scientific persons get an idea of how accurate it is. It would be more correct to say that the atomic clock, in situations of normal operation, is accurate to one part in 10^18.

    For that matter, it doesn't hold a wall-clock type value, like saying it's exactly 22:04:17.832... Our choice of reference for time (say, when "noon" is), is difficult to measure and quite arbitrary. Instead, you're interested in, say, how long a particular process takes (light making a round trip, or atomic decay), measured to a very high degree of accuracy (and precision).

    Of course units of time are arbitrary. All units are arbitrary. Dimensions (length, time, etc.) and fundamental constants are non-arbitrary, but don't have any "natural" expression in terms of the units we use. (The most natural system of units is arguably expressing everything in terms of fundamental constants.) Seconds, minutes, hours, and years have arbitrary definitions for our convenience, just like any other unit.
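    The "1 second in 30 billion years" conversion in the parent is simple arithmetic (a sketch, taking the 365.25-day Julian year):

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600   # Julian year, ~3.156e7 s
span_seconds = 30e9 * SECONDS_PER_YEAR  # "30 billion years" in seconds

# Losing 1 second over that span is a fractional accuracy of:
fractional = 1.0 / span_seconds
print(f"{fractional:.2e}")  # ~1.06e-18, i.e. roughly one part in 10^18
```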
  • Re:Second Minute (Score:3, Informative)

    by mattdm ( 1931 ) on Saturday November 20, 2004 @03:20PM (#10875826) Homepage
    OED backs this up:

    a. F. seconde, ad. med.L. secunda, fem. of L. secundus SECOND a., used ellipt. for secunda minuta, lit. 'second minute', i.e. the result of the second operation of sexagesimal division; the result of the first such operation (now called 'minute' simply) being the 'first' or 'prime minute' or 'prime' (see PRIME n.2 2)
  • by Anonymous Coward on Saturday November 20, 2004 @03:46PM (#10875939)
    What you describe is already practiced in industry; it is simply called laser interferometry.
    If this new atomic clock can be used to tune laser diodes even better than before, then this should improve distance measurement accuracy.

    Better atomic clocks could probably also improve "time of flight" distance measurements. "Time of flight" means the time taken for a brief pulse of light to exit the source, reflect off the target, and reach the source again. Do the math and you get the distance measured.

    This could also be used to triangulate an RF emission source using three towers, but it's not like that can't be done with sufficient accuracy today using existing atomic clocks, or even simply with directional antennas.
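    The time-of-flight "math" mentioned above is a one-liner (a sketch; the microsecond example is my own):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_s):
    """Target distance from the round-trip time of a light pulse."""
    return C * round_trip_s / 2.0  # halved: the pulse travels there and back

print(tof_distance(1e-6))  # a 1-microsecond round trip -> target ~150 m away
```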
  • Re:Second Minute (Score:2, Informative)

    by JanPeterBalkenende ( 820739 ) on Saturday November 20, 2004 @05:35PM (#10876625)
    From http://www.etymonline.com/ [etymonline.com]:

    second (n.)
    "one-sixtieth of a minute," 1391, from O.Fr. seconde, from M.L. secunda, short for secunda pars minuta "second diminished part," the result of the second division of the hour by sixty (the first being the "prime minute," now called the minute), from L. secunda, fem. of secundus (see second (adj.)). Shortened form sec first recorded 1860.

    So sort of true, but of course the use of second and minute as time units originates in Latin.
  • Re:Second Minute (Score:4, Informative)

    by Anonymous Coward on Saturday November 20, 2004 @05:38PM (#10876645)
    I guess while we're at it, Queen Elizabeth should be credited with the invention of the time machine as well.

    According to multiple sources (see Eli Maor, Trigonometric Delights, Princeton Press, etc):

    "The Greeks called the sixtieth part of a degree the "first part," the sixtieth part of that the "second part,"...

    In Latin the former was called pars minuta prima ("first small part") and the latter pars minuta secunda ("second small part"),
    from which came our minute and second."

    The actual subdivisions are Babylonian in origin, since they invented the concept of the 24-hour day
    with sexagesimal units of time (hours), which were subdivided a SECOND time into 60 TINIER chunks (seconds).

    Notice also that most romance languages have words for this unit of time that not only predate Queen Elizabeth's birth, but the English language itself.
  • by agraboso ( 832821 ) on Saturday November 20, 2004 @05:41PM (#10876660) Homepage

    Fundamental constants of Nature changing over the Universe's history and/or over space is a topic of debate in the physics community (in which I include myself, being a grad student in physics).

    There is no compelling theoretical reason that suggests this running of the fundamental constants. There is some experimental (astrophysical) evidence that could be explained in this way, and several models have been developed. They would have far-reaching consequences, changing our views on cosmology and the Standard Model of particle physics.

    Pinkfud, your "simple" argument is a trivialization of the issue and, in fact, doesn't make much sense. For example, no observer could stand outside the universe, because there's nothing outside the universe.

    I don't know if you got your ideas from a "popularising" science magazine (don't ever trust them) or misinterpreted a more serious source. But keep researching into it, and if you get the opportunity to discuss it with a physicist, do it.

    P.S.: I intend this comment only to point out that your understanding is incorrect and to encourage your interest in physics.

  • Re:Second Minute (Score:1, Informative)

    by Anonymous Coward on Saturday November 20, 2004 @06:17PM (#10876867)
    Sorry, mate, but that's just not true.

    Second comes directly from Latin and has found its way into Germanic and Slavic languages via that etymology.

    To disprove that in two seconds: Other languages use the exact same word from the same source. And they did so long before Queen Elizabeth.

    It found its way into Middle English in 1391 (first occurrence) from Old French as 'seconde'. The rest, however, is pretty accurate. The big minutes were called "prime minute" and the small ones "second minute", even in French.

    Don't mod up anything that sounds smart, people. Mods...
  • by Anonymous Coward on Saturday November 20, 2004 @06:19PM (#10876883)
    Nope, the inch is defined in terms of meters, and has been since 1959. This also made the inch more accurate, if you want to put it like that.
  • Re:Great! (Score:1, Informative)

    by Anonymous Coward on Saturday November 20, 2004 @09:14PM (#10877946)
    First off. This new "atomic clock" is most likely not a clock at all. Rather it is a frequency standard. That's the first step in building a clock: a nice, repetitive "ticking" thing.

    So what is accuracy in this context? Primary frequency standards based upon cesium attempt to realize the definition of the second. They do so by struggling to reduce uncertainty. So, when you build a primary frequency standard, you toss a cable out into the room and say, "Here's a signal with frequency X plus or minus Y."
    Some folks call the Y the accuracy, but it's really the uncertainty.

    As far as an optical frequency standard goes, there is no definition to stand upon regarding the frequency it produces. However, you can take a hard look at all known sources of error and state their root-sum-square as the uncertainty of the frequency it produces.

    Think about that watch described above, and think of its ticking as a frequency standard and you'll see that if it is neither fast nor slow then it has good accuracy, regardless of how the "ticks are labeled."
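    The root-sum-square combination the parent mentions looks like this (a sketch; the error-budget values are purely illustrative, not a real standard's budget):

```python
import math

def rss(error_sources):
    """Combine independent fractional uncertainties by root-sum-square."""
    return math.sqrt(sum(e * e for e in error_sources))

# Hypothetical error budget for an optical standard (illustrative values only)
budget = [2e-16, 1e-16, 5e-17]
print(f"{rss(budget):.2e}")  # ~2.29e-16: dominated by the largest term
```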
  • by zwalters ( 532390 ) on Saturday November 20, 2004 @09:55PM (#10878140)
    Sorry for all the posts: I now really hate the "HTML formatted" box.

    The standard press description is a little confusing. A good way to think about the subject is that atomic clocks are extremely good frequency standards, which incidentally makes them good time standards as well (if I have a pendulum that oscillates once per second, I can measure time by counting the number of oscillations).

    The idea behind all atomic clocks is that atoms are very picky about the kinds of light they absorb and emit (that's how astronomers can tell what kinds of atoms make up stars). There are some frequencies of light that interact very strongly with any given kind of atom, and some frequencies where the light barely interacts at all. When the atom absorbs a photon, it jumps to a higher energy state; when it emits a photon, it jumps to a lower energy state.

    If you look carefully at the spectrum of light that an atom absorbs or emits, you'll find that the atom isn't equally picky about every kind of transition that it can make. There are some transitions (in cesium, they're called hyperfine transitions) where the atom isn't just picky, it's positively fastidious. What you'll find is that if you want to excite these transitions, you'll have to shine light that is exactly the right frequency, plus or minus a tiny amount (the "linewidth" -- literally, if you plotted absorption vs. frequency, the width of the peak you would see on the graph).

    So reasoning backwards, if I'm shining a laser at a cavity of cesium atoms and I measure that they're strongly absorbing the light, then I know the frequency of the laser has to be *exactly* the frequency that excites the atom, plus or minus a tiny linewidth. So I can count the oscillations of my laser and figure out how much time has elapsed. That's basically how an atomic clock works.

    But if you wanted to get really anal about it, you could point out that I really don't know the exact frequency of my laser at all -- all I know is that it's the frequency of the atomic transition, plus or minus the linewidth. So if a hypothetical Alice and Bob in adjacent laboratories had lasers locked to the same transition, it's possible that Alice could have her laser locked at the atomic transition frequency minus a linewidth, while Bob has his locked at the atomic transition frequency plus a linewidth. (The linewidth was chosen to be really small, but it's still not zero: also, really narrow lines are hard to lock a laser to, so there's always a tradeoff involved.) So Alice and Bob's clocks will drift a tiny bit relative to each other. But because the linewidth is so small, it will take an insane number of oscillations before Bob measures that one more second has passed than Alice measures. The "1 second in 30 billion years" is just a reflection of this: it measures the linewidth of the transition relative to the frequency of light involved.

    The appeal of using Mercury or Strontium atoms (small world: one of my best friends is also working on a Strontium time standard) is that they have a special transition that is even narrower relative to the transition frequency than Cesium's hyperfine transition.
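    The Alice-and-Bob worst case in the parent can be put in numbers (a sketch; the 1e-18 fractional linewidth is an assumed figure, not a measured one):

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # Julian year, ~3.156e7 s

def years_to_drift(fractional_offset, drift_seconds=1.0):
    """Years until two clocks offset by `fractional_offset` differ by `drift_seconds`."""
    return drift_seconds / fractional_offset / SECONDS_PER_YEAR

# Worst case: Alice locked a linewidth low, Bob a linewidth high -> offset 2e-18
print(f"{years_to_drift(2e-18):.2e}")  # ~1.58e10 years before they disagree by 1 s
```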
  • Re:How do they know? (Score:3, Informative)

    by Detritus ( 11846 ) on Sunday November 21, 2004 @01:57AM (#10879210) Homepage
    Hydrogen masers have better short-term stability than cesium frequency standards, so one can compare the two and measure the short-term variation in the frequency of the cesium standard.

    Clocks can also be run in groups. With some mathematics, the group can produce a result that is more accurate than a single clock.

    If you have a detailed knowledge of the physics involved in the operation of a clock, the possible sources of error can be modeled and predicted.
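    The simplest version of the "group of clocks" mathematics is plain averaging: with N independent clocks, random error shrinks by roughly sqrt(N). A toy simulation (the noise figure is invented; real timescale algorithms use weighted ensembles, not naive means):

```python
import random
import statistics

random.seed(42)
TRUE_INTERVAL = 1.0   # the "true" duration being measured, in seconds
NOISE = 1e-6          # per-clock random error (illustrative only)

def read_clock():
    """One clock's noisy measurement of the true interval."""
    return TRUE_INTERVAL + random.gauss(0, NOISE)

# Error of single clocks vs. error of a 100-clock ensemble average
single = [abs(read_clock() - TRUE_INTERVAL) for _ in range(1000)]
ensemble = [abs(statistics.mean(read_clock() for _ in range(100)) - TRUE_INTERVAL)
            for _ in range(1000)]
print(statistics.mean(single) / statistics.mean(ensemble))  # ~10, i.e. sqrt(100)
```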

  • Re:Compared to what? (Score:3, Informative)

    by Detritus ( 11846 ) on Sunday November 21, 2004 @02:07AM (#10879263) Homepage
    The Earth is a lousy time standard. The international atomic time scale (TAI) does not have leap seconds and is not synchronized to the movement or rotation of the Earth. Civil time (UTC) has leap seconds to keep it synchronized with the Earth's rotation. This is for the convenience of people who use it for navigation.
