Encryption | Security | Technology

Using Lasers To Generate Random Numbers Faster

Pranav writes "Using semiconductor lasers, scientists from Takushoku University, Saitama University, and NTT Corporation achieved random number generation rates of up to 1.7 gigabits per second, about 10 times the best previous rate produced by a physical phenomenon. Future work may center on devising laser schemes that can achieve rates as high as 10 Gbps."
  • by Wrath0fb0b ( 302444 ) on Sunday December 28, 2008 @04:32PM (#26251689)

    Has anyone out there actually had their system bottlenecked by lack of random numbers? I had thought that the bottleneck in serving large amounts of SSL content was processing the asymmetric part of the crypto -- hence the need for SSL accelerator cards. It's a nice invention and a creative application of a physical process, but I really want to see just one case where this would lead to a substantial benefit.

    As an aside, computer simulations always use pseudo-RNGs like the Mersenne Twister [1]. For a reasonable exponent (I use 19937 in my simulations), this gives a period > 10^6000 and virtually no correlations between adjacent calls. The notion of a computational physicist using a real physical RNG is laughable. (A quick sketch follows the footnote.)

    [1] http://en.wikipedia.org/wiki/Mersenne_twister [wikipedia.org]
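
    Incidentally, Python's built-in random module is itself an MT19937 Mersenne Twister, so the reproducibility is trivial to demonstrate; a minimal sketch (the seed is chosen arbitrarily for illustration):

        import random

        # Python's random module uses MT19937 under the hood: period 2**19937 - 1,
        # roughly 10**6001, which is where the "> 10^6000" figure comes from.
        rng_a = random.Random(19937)   # arbitrary seed
        rng_b = random.Random(19937)   # the same seed...

        draws_a = [rng_a.random() for _ in range(5)]
        draws_b = [rng_b.random() for _ in range(5)]

        assert draws_a == draws_b      # ...yields the exact same stream
        print(draws_a)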

  • by hweimer ( 709734 ) on Sunday December 28, 2008 @04:52PM (#26251803) Homepage

    Has anyone out there actually had their system bottlenecked by lack of random numbers?

    I know some guys doing quantum Monte Carlo simulations. And yes, fast RNGs are crucial for their algorithms.
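
    For a sense of the scale a software generator already reaches -- and what the laser scheme has to beat -- here is a rough throughput sketch; the numbers vary by machine and are purely illustrative:

        import time
        import numpy as np

        # Time NumPy's MT19937-backed generator; a Metropolis-style Monte Carlo
        # sweep burns at least one draw per proposed update, so draws/second is
        # the budget that matters.
        n = 10_000_000
        rng = np.random.Generator(np.random.MT19937(seed=42))  # arbitrary seed

        t0 = time.perf_counter()
        samples = rng.random(n)            # n doubles in [0, 1)
        dt = time.perf_counter() - t0

        # Each double carries 53 random bits, so this is a rough bit rate.
        print(f"{n / dt / 1e6:.0f} M draws/s, ~{53 * n / dt / 1e9:.1f} Gbit/s")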

  • by Frequency Domain ( 601421 ) on Sunday December 28, 2008 @04:57PM (#26251847)
    First off, this is old news -- the article is copyright 2007.

    Next, the article claims...

    Generating random numbers using physical sources -- which can be as simple as coin-flipping and tossing dice -- are preferred over other methods, such as computer generation, because they yield nearly ideal random numbers: those that are unpredictable, unreproducible, and statistically unbiased.

    This is garbage -- there are applications where people prefer physical sources, but those of us doing simulation work realized long ago that good algorithmic sources are far better for our needs:

    1) It's mighty hard to debug a complex simulation model without reproducibility.

    2) You can use the reproducibility to induce covariance between runs, greatly reducing the standard error of your estimates for a given sampling effort (a toy sketch follows below).

    3) The distributions of algorithmically generated pseudo-random numbers are provably uniform, whereas for physical sources the best you know is that they haven't (yet) failed a hypothesis test for uniformity.

    Finally, the last statement about being "statistically unbiased" is utter nonsense -- unbiasedness is a property of an estimator, not a distribution.
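
    Point 2 deserves a toy demonstration (common random numbers; the two functions here are arbitrary stand-ins for two model variants, not any real simulation):

        import random
        import statistics

        def f(u):                       # variant A of some model response
            return u ** 2

        def g(u):                       # variant B, slightly different
            return u ** 2 + 0.1 * u

        def diff_estimate(seed_f, seed_g, n=1000):
            # Estimate E[f] - E[g], each mean driven by its own random stream.
            rf, rg = random.Random(seed_f), random.Random(seed_g)
            mean_f = sum(f(rf.random()) for _ in range(n)) / n
            mean_g = sum(g(rg.random()) for _ in range(n)) / n
            return mean_f - mean_g

        # 200 replications: same seed for both streams (common random numbers)
        # versus unrelated seeds (independent streams).
        crn   = [diff_estimate(s, s) for s in range(200)]
        indep = [diff_estimate(s, s + 10_000) for s in range(200)]

        print("stdev with common random numbers:", statistics.stdev(crn))
        print("stdev with independent streams:  ", statistics.stdev(indep))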

  • by Wrath0fb0b ( 302444 ) on Sunday December 28, 2008 @05:58PM (#26252317)

    Actually, I know quite a bit about (stochastic*) computational physics and the notion that "repeatable" means "can run the exact same simulation with the exact same seed and get the exact same result" is absolutely incorrect. What is meant by "repeatable" is that one can extract from the simulations some sort of macroscopic quantity (usually a thermodynamic quantity or a correlation function) whose average is consistent across many separate runs (known in the biz as the ensemble average). So, for instance, if I'm observing the coalescence of polymers into a hex-phase (as in [1]), I could measure the average number of aggregated copolymer blocks and compare those (as was done in that paper).

    Let's make an extended gambling analogy. Suppose I have a new roulette table that I want to certify works as it should. One suggestion (akin to what you said) would be to put the entire table under the same initial conditions as a known-good table and see if it gives the same results. A more sophisticated approach would be to make a histogram of results for a large number of independent rolls and see if it converges to the proper distribution (or, if the distribution isn't known theoretically, compare it to the distribution from a different device, also tested a large number of times; a sketch of this appears after the footnotes). I would argue that the second method is much more powerful than the first, because it probes a more relevant value. Nobody cares whether the roulette table gave 00 the first time and 23 the second time -- we are only concerned that, on average, it gives 00 with the same probability as 23.

    In stochastic computational simulations, the same story applies. Nobody cares whether a particular simulation did X or Y or Z, because that's not relevant. What is relevant is the (converged) probability that, given some starting condition, the system ends up in X or Y or Z.

    * None of these comments apply in any way to solving deterministic systems. You don't need random numbers for those anyway.

    ** Another commenter pointed out that exact repeatability is incredibly useful for debugging purposes. That is true, but it has nothing to do with reproducibility in the scientific sense of the word.

    [1] http://link.aip.org/link/?JCPSA6/128/184906/1 [aip.org]
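
    The "second method" is easy to mock up; a sketch of certifying a simulated wheel with a chi-square statistic (the layout is a standard American wheel; seed and sample size are arbitrary):

        import random
        from collections import Counter

        POCKETS = [str(n) for n in range(1, 37)] + ["0", "00"]  # 38 pockets

        rng = random.Random(2008)      # arbitrary seed
        n = 380_000                    # expect ~10,000 hits per pocket
        counts = Counter(rng.choice(POCKETS) for _ in range(n))
        expected = n / len(POCKETS)

        # Pearson chi-square against the uniform distribution.
        chi2 = sum((counts[p] - expected) ** 2 / expected for p in POCKETS)
        print(f"chi-square = {chi2:.1f} on {len(POCKETS) - 1} degrees of freedom")
        # For 37 degrees of freedom the 5% critical value is about 52; values
        # far above that, run after run, would flag a crooked wheel.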

  • by ZombieWomble ( 893157 ) on Sunday December 28, 2008 @06:58PM (#26252755)
    While Slashdot is often not on the bleeding edge, this news isn't exactly ancient: the article itself is dated just last week, and it correctly cites a paper that was published only a month ago. Don't believe everything you read in a copyright tag.

    As for the rest of it, yes, much of the article is rather terrible.

  • by elashish14 ( 1302231 ) <profcalc4@nOsPAm.gmail.com> on Sunday December 28, 2008 @07:26PM (#26252967)
    Not true at all -- Monte Carlo (MC) is the best method for doing integration in a multi-dimensional space. My research team used it a lot, and it's nearly impossible without a good RNG. (A toy sketch follows.)
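
    A toy version of what the parent means (the integrand is just the indicator of a 10-dimensional ball, chosen because the exact answer is known; nothing here comes from the parent's actual research):

        import math
        import random

        # MC integration error falls as 1/sqrt(N) regardless of dimension,
        # which is why it wins in high-dimensional spaces where grid
        # quadrature is hopeless.
        d, n = 10, 1_000_000
        rng = random.Random(0)   # arbitrary seed

        hits = sum(
            sum(rng.uniform(-1, 1) ** 2 for _ in range(d)) <= 1.0
            for _ in range(n)
        )
        estimate = hits / n * 2 ** d                        # cube volume is 2^d
        exact = math.pi ** (d / 2) / math.gamma(d / 2 + 1)  # unit-ball volume

        print(f"MC estimate: {estimate:.3f}   exact: {exact:.3f}")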
  • by Wrath0fb0b ( 302444 ) on Sunday December 28, 2008 @11:52PM (#26254519)

    A roulette ball is quite large enough (by many orders of magnitude) to treat as a purely classical particle.

    You are right about one thing -- time to brush up on your Quantum. Start by calculating the de Broglie wavelength (the relevant QM length) of a roulette ball traveling at the maximum speed you might see at a casino and compare it to the radius of the ball itself.
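
    Doing that homework explicitly (ball mass, speed, and radius below are rough guesses, not measured casino values):

        # de Broglie wavelength: lambda = h / (m * v)
        h = 6.626e-34   # Planck's constant, J*s
        m = 0.010       # roulette ball, ~10 g
        v = 5.0         # generous launch speed, m/s
        r = 0.01        # ball radius, ~1 cm

        wavelength = h / (m * v)
        print(f"lambda ~ {wavelength:.1e} m vs. radius {r:.0e} m")
        print(f"ratio  ~ {wavelength / r:.1e}")   # ~1e-30: QM is utterly negligible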
