A Look at Photonic Clocking

zymano writes "In an article on the Electronic Engineering Times site, James Siepmann shares a few thoughts about Photonic Clocking. Siepmann states: 'Copper interconnects are reaching their limit as data-transmission bandwidth and processor speed continue to rise. [...] Photonic clocking not only solves the limitations of electronic clocking, but also reduces jitter, skew, delay, crosstalk and power consumption while maintaining clock signal integrity for longer distances.'" Are Photonic Processors the next logical step, or will the almighty buck shuffle them aside because of cost?
  • by Anonymous Coward on Saturday September 17, 2005 @09:01PM (#13587636)
    Are Photonic Processors the next logical step, or will the almighty buck shuffle them aside because of cost?

    Yeah, 'cause technology never gets cheaper. Hey, I've got an AT&T 8086 PC with a lovely green monitor that you can have for $5000, if you act now...
    • Ooooo! (Score:4, Funny)

      by Saeed al-Sahaf ( 665390 ) on Saturday September 17, 2005 @09:04PM (#13587648) Homepage
      Yeah, 'cause technology never gets cheaper. Hey, I've got an AT&T 8086 PC with a lovely green monitor that you can have for $5000, if you act now...

      Hang on to it. In a few years, you can haul it down to Antiques Roadshow and have 'em tell you it's "worth between $2000 and $4000, but for insurance purposes..."


      • I hate to comment on this but feel compelled due to boredom. WTF is with these antique shows?

        I suspect that they edit the parts where the "evaluator" says, "this is just shit from your grandma's basement". And yet people apparently gobble this stuff up like wannabe lottery winners.

        Knowing current TV production tactics, it wouldn't surprise me if multiple takes were done to get the "perfect" reaction shot.

        FYI, the "reaction shot" is the 21st century's version of the laugh track.
    • Hell, I've got an AT&T 8086 with a COLOUR monitor for $4,800. Plus S&H and all applicable taxes.
  • by Anonymous Coward
    " Are Photonic Processors the next logical step, or will the almighty buck shuffle them aside because of cost?"

    You tell us. Ultimately it's your present purchases that will fund it.

    --
    The "are you a script" word for today is rubbers.
  • by Anonymous Coward
    Aren't there like 5 or 6 orders of magnitude between optical frequencies and current clock rates? And last I checked, copper waveguide does just fine for at least 3 or 4 of those. So how is it that we are "reaching the limitations" of copper?
    • by goombah99 ( 560566 ) on Saturday September 17, 2005 @10:04PM (#13587821)
      Distributing your clock with photons implies you have a photonic waveguide. If you are going to build a photonic waveguide, then why not build an electrical waveguide instead? Electrical waveguides, like for example coax cable, have wave velocities that are faster than light in glass, so they would logically be even better. And you don't need any special materials like you would for optical waveguides.

      The problem might be that waveguides usually have to be on the order of the wavelength to work right. GHz wavelengths are larger than the chip, so this consideration forces you toward the optical region.

      But you can beat this in two ways.

      1) Use negative-index-of-refraction materials. Then the waveguide can be smaller than the wavelength.

      2) Use near-field waveguides with amplification. When the wavelength is a lot larger than the waveguide, the wave becomes evanescent (decaying), so it can't propagate very far. But hey, that's okay, because the chip is not very wide either, so we can tolerate some loss of signal. And we could toss in some amplification to offset it, as sketched below.
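
      As a back-of-the-envelope sketch of option 2 (Python; the 100-micron guide width and the 10 GHz clock are made-up chip-scale numbers, not from the article), below cutoff the lowest TE mode decays as exp(-z/L) with L = 1/sqrt(kc^2 - k^2):

        import math

        c = 299_792_458.0                       # speed of light in vacuum, m/s

        def evanescent_decay_length(f_hz, a_m):
            """1/e decay length of the lowest TE mode of a guide of width a_m,
            driven below its cutoff frequency (the evanescent regime)."""
            k = 2 * math.pi * f_hz / c          # free-space wavenumber
            kc = math.pi / a_m                  # cutoff wavenumber of the TE1 mode
            if k >= kc:
                raise ValueError("above cutoff: the mode propagates, no decay")
            return 1.0 / math.sqrt(kc**2 - k**2)

        # A 10 GHz clock in a 100-micron on-chip guide is far below cutoff:
        print(evanescent_decay_length(10e9, 100e-6))   # ~3.2e-5 m, or ~32 microns

      So the field dies off in tens of microns while a die is on the order of 10 mm across, which suggests the amplification in option 2 would be doing most of the work.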

      • by half_d ( 314945 ) on Saturday September 17, 2005 @10:57PM (#13587939)
        Photonic clocking not only solves the limitations of electronic clocking,
        but also reduces jitter, skew, delay, crosstalk and power consumption while maintaining clock signal integrity for longer distances
        So if I get one of these photonic alarm clocks, will it make me not feel so terribly jittery and skew in the morning? (Don't even get me started on the delay)
      • by Vireo ( 190514 ) on Sunday September 18, 2005 @12:23AM (#13588183)
        Electrical wave guides, like for example coax cable, have wave velocities that are faster than light in glass, so they would logically be even better.


        Err, actual wave speed never was a problem. Electromagnetic force already propagates at the speed of light in an electric conductor.

        It's the modulation speed (i.e., how fast you can vary the signal inside the channel) that is much higher in photonic devices. In conductors, losses are very high for rapidly varying signals, and as you said, microwave guides are much too large for chips. Evanescent fields are also a problem, since they can spread very far from small guides.

        This is even true with light: for example, a standard coupler ("Y") for visible or IR wavelengths must normally be several centimeters long. However, so-called photonic-bandgap devices are solving this problem.
        • I beg to differ. If the modulation is terrible, then it's not a well-terminated waveguide. Period. A wire does not constitute a waveguide.
          • Re:I disagree (Score:3, Informative)

            erm ... a waveguide is a waveguide, no matter what kind of terminators you use. The pertinent condition is that it supports propagation modes.
            • But in the interests of discussion, okay, let me rephrase my point using your terminology: a "wire" is a "waveguide" that supports too few modes or is too dispersive to support clean modulation. Furthermore, if the impedance varies quickly with frequency (or is otherwise dispersive), then the end termination will be essentially impossible to match. You will get ringing that will prevent clean modulation. Hence termination prevents clean modulation. In my experience it's really bad termination that l
        • Err, actual wave speed never was a problem. Electromagnetic force already propagates at the speed of light in an electric conductor.

          No, it doesn't. I'll assume when you say "speed of light" you intend the common shorthand for "speed of light in a vacuum". A signal in an electrical conductor propagates at a substantial fraction of "the speed of light", somewhere in the 40%-60% range IIRC. And even light itself travels more slowly than "the speed of light" through any medium other than a vacuum, I think abou
          • So you first assume I meant the speed of light in vacuum, and then infer from that that I know shit about physics, and bitch about how I'm moderated. The problem is, as this is Slashdot and not an engineering department, I thought it wasn't necessary to write a white paper on transmission lines and waveguides as an introduction.

            For the record, the whole point of my post was that the speed of light (be it in the vacuum or in a dielectric medium, or the propagation speed inside a waveguide) never was the limiting f
            • It may cause delay problems over large distances (which isn't related), but I doubt that chipmakers are bitching about the slowness of c (then again, I'll admit I know nothing about chip design).

              So, what do we have so far:

              1. You make a factually incorrect statement about physics (electrical signals propagate through conductors at the speed of light) and a factually incorrect statement about chip design (that propagation speed of clock signals is not a problem).
              2. I call you on the physics error only because I thou
  • by Doug Coulter ( 754128 ) on Saturday September 17, 2005 @09:10PM (#13587662) Homepage
    Nice thing about a pulse of light is that it can be made to reach lots of places at the same time, or nearly so. Just a normal burst of light from a point source has a spherical wavefront, but this can be modified by optics in various ways. Having designed plenty of really fast stuff and having had to deal with skew problems, I can see the advantage, if real use can be made of it. I think it might even be possible on silicon, which would be required for quick adoption -- after all, the LSI only has to receive; the clock light source can be made of anything. Making a hybrid of course drives costs way up, though, but at current profit margins for fast CPUs this may not be much of a real issue.
    • Not to mention... (Score:4, Insightful)

      by Sensible Clod ( 771142 ) on Saturday September 17, 2005 @09:57PM (#13587803) Homepage
      clockless CPUs, which of course--wait for it--don't need a clock. (I realize that other system parts may still need it.) Every once in a while, I hear a tiny thing about clockless chips, but it seems like the Big 2 don't want anything said about them.

      Reminder: this comment is on topic.
      • by Lehk228 ( 705449 )
        I doubt that clockless chips go undiscussed due to any sort of collusion, but rather because of the nasty hurdles in programming, especially at a low level. With clockless chips there is no more 'tick', no purely step-by-step execution, and simultaneous execution of complex code is nasty, especially when trying to work out the timing of code when there is no universal timer across the chip.
        • At the chip level of gates, designers usually aren't executing sequences of instructions (except in microcode). VHDL and Verilog programming is "simultaneous", accounting for propagation delays. Those programmers operate on clock cycles and states, but each clock cycle can do many simultaneous operations, synced to the clock for correct delivery of multi-state results to other circuits at the right time. So those familiar programming environments, which use clocks, are not bound to them the same way "assemb
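
          (A toy illustration of that "many simultaneous operations, latched to the clock" semantics, in plain Python; the registers and update rules are invented for the example:)

            def combinational(state):
                """Pure function of the *current* state; the order of the entries
                doesn't matter, just like concurrent HDL assignments."""
                return {
                    "a": state["b"],                   # a and b swap every cycle...
                    "b": state["a"],
                    "count": (state["count"] + 1) % 4  # ...while a counter advances
                }

            def clock_edge(state):
                """All registers latch their next values atomically, as flip-flops
                do on a rising edge; nothing ever sees a half-updated state."""
                return combinational(state)

            state = {"a": 0, "b": 1, "count": 0}
            for _ in range(3):
                state = clock_edge(state)
                print(state)   # the swap and the increment happen "at the same time"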
      • Uh, (Score:5, Informative)

        by autopr0n ( 534291 ) on Saturday September 17, 2005 @10:17PM (#13587845) Homepage Journal
        Transistors don't need clocks, logic gates don't need clocks, but flip-flops do. The reason you need a clock is that the outputs of a bit of logic will be 'unstable' while the result is computed. The clock tells the next piece of the system when to read. In place of that, you'd need a 'done' signal, which would raise transistor counts quite a bit. Not to mention it would be very hard to find people who know how to design these things.

        I think the future of the CPU involves different parts of the system operating on separate clocks, transferring data via a 'networking' type system. Computers connected via Ethernet don't need to have their clocks synched in order to work.

        Think of a simple instruction decoder. The decoder reads the instruction and opens the right 'gates' in the CPU so that there is an electrical connection between the two registers and the ALU, and inside the ALU to the adder or subtractor or whatever, depending on what instruction you're trying to run. Then the clock signals the ALU that the registers are ready. Without the clock, the ALU might try to add the wrong things. (The ALU doesn't need a clock to work.) In the future you could have some sort of system where the decoder just sends a message to the ALU telling it to set up the adder, and to the register file to access these two registers. Then the register file will send the data to the ALU whenever it's ready, as sketched below.
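
        (To make the "done signal" idea concrete, here is a minimal sketch with Python threads and queues standing in for handshake wires; the decode/ALU split is the example above, everything else is invented:)

          import threading, queue

          def stage(fn, inbox, outbox):
              """A pipeline stage fires whenever its input arrives; putting the
              result on the outbox *is* the 'done' signal for the next stage."""
              while (item := inbox.get()) is not None:
                  outbox.put(fn(item))
              outbox.put(None)                      # propagate shutdown

          decode_q, alu_q, out_q = queue.Queue(), queue.Queue(), queue.Queue()
          threading.Thread(target=stage, args=(lambda ops: ("ADD", ops), decode_q, alu_q), daemon=True).start()
          threading.Thread(target=stage, args=(lambda op: op[1][0] + op[1][1], alu_q, out_q), daemon=True).start()

          for operands in [(1, 2), (3, 4)]:
              decode_q.put(operands)                # no shared clock anywhere
          decode_q.put(None)

          while (result := out_q.get()) is not None:
              print(result)                         # 3, then 7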
        • Re:Uh, (Score:1, Informative)

          by Anonymous Coward
          It's been done. Honeywell 6180 from the 1970s. It used these "done" signals at every stage of the processor. There are people out there who can design and build these kinds of systems. Sure, the transistor counts are higher, but we seem to have no problem squeezing more and more transistors onto a chip...
        • Re:Uh, Yes They Do (Score:4, Informative)

          by nathanh ( 1214 ) on Saturday September 17, 2005 @11:42PM (#13588079) Homepage
          Computers connected via Ethernet don't need to have their clocks synched in order to work.

          Ethernet does rely on synchronised clocks. You may be thinking of the fact that Ethernet doesn't have a clock line, meaning there is no dedicated wire carrying a clock signal.

          There is a high-precision clock on every Ethernet card. An Ethernet frame has a 64-bit preamble with Manchester encoding. That preamble adjusts the skew of the receiver clock so that it's synchronised with the transmitter clock. If the synchronisation didn't occur, you wouldn't know when to latch the data on the line and you couldn't receive a frame. The synchronisation occurs on every Ethernet frame and the precision of the clock must be high enough that the synchronisation lasts for the length of a frame.

          Async architectures will likely use a similar technique. The subsystems won't be driven by a system-wide clock line, as in the existing synchronous architectures, but the various clocks in subsystems will certainly be synchronised.
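
          (A toy encoder shows the mechanism; a sketch only, assuming the usual 802.3 convention that a 0 is a high-to-low mid-bit transition and a 1 is low-to-high:)

            def manchester(bits):
                """0 -> high half then low half, 1 -> low half then high half,
                so every bit carries a guaranteed mid-bit transition."""
                return [half for b in bits for half in ((0, 1) if b else (1, 0))]

            preamble = [1, 0] * 8          # a slice of Ethernet's 1010... preamble
            print(manchester(preamble))    # 0,1,1,0,0,1,1,0,... a clean square wave
                                           # the receiver clock can phase-lock to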

          • Yes, and faster Ethernet cards have faster clocks because, according to relativity theory, time slows down if you go faster. Thus, a fast card needs a faster clock to compensate for the slower time.
            • Yaknow, I would mod you funny, but I think you're serious, and that scares me.

              Hint: Relativity has nothing to do with why faster Ethernet cards might use a faster clock. Think "frequency" instead.

              --S
        • Re:Uh, (Score:1, Interesting)

          by snotclot ( 836055 )
          Dude, I didn't know the autopr0n guy knew some EE. Props to you; I thought you were a high-level CS guy. BTW, whatever happened to your server?
        • A few years ago, I read the book "Asynchronous Circuit Design" by Chris J. Myers, which explains the basics of designing digital async logic using special techniques, gaining advantages in power consumption and system speed and, to stay on topic, needing no clock to be propagated through the silicon at all.

          The real questions are:
          - why did these techniques never gain importance (the first studies are from S. Unger in 1969)?
          - what about the EDA industry?
          • The real questions are:
            - why did these techniques never gain importance (the first studies are from S. Unger in 1969)?

            Probably because, at all points through history, the chip companies had other, cheaper techniques to get their next generation of chips out of the door, without sinking into costly research that would have forced them to redesign everything...

            - what about the EDA industry?

            Probably because they reasoned their customers went with the previous answer...

          • Part of the problem may be the sheer complexity of Asynchronous State Machines (ASMs). The basic function of a clock in any state machine is to limit the state-transition space. For anything other than rather simplistic systems (few states), runaway state transitions occur in ASMs. Nature seems to have solved this problem with analog systems on a macro level. On a quantum level, what may possibly look like an ASM is so bizarre (like tunnel diodes) that it's a wonder anything exists at all.

            At the current level of understanding, this

      • Re:Not to mention... (Score:1, Interesting)

        by Anonymous Coward
        The reason you don't see asynchronous chips is that it's damn near impossible to guarantee that they work.

        With standard clocked CPUs, you can draw a box around the chip, or part of it, and treat it like a black box. Signal in, signal out, clock: change any and all of these to guarantee/profile your behavior. That doesn't work on async design. There's no way you could test it all and have any confidence that you did it right.
      • Clockless logic is far, far more complex to design. That is all that is holding it back, not some mysterious pointless collusion.
      • Don't kid yourself. At least one of the big two has published papers on applications of asynchronous design. In 1997 or '98 at ITSW, Marly Roncken of Intel presented on the testability of asynchronous circuits related to cache logic. Clockless logic, it turns out, is rather difficult to test.
    • Yeah, because light travels at ... drumroll ... the SPEED OF LIGHT! Which is, like, really fast.

      Whereas an electrical pulse on a copper wire travels at ... nearly the speed of light, which is... also really fast.

      Wait. What's the advantage of photonics?
      • less heat, less energy consumption, potential for smaller pathways, and higher speed (by definition, thanks to the tidbit of information you shared with us just there).

        to name a few...
        • less heat, less energy consumption, potential for smaller pathways, and higher speed

          Indeed. You've hit the nail on the head. Of course, the majority of jokesters around here will probably continue to ridicule the concept in other threads, mostly because they don't know what they're talking about. The difference between current processor technology and photonic processors is the same as the difference between copper wire network connections and fiberoptic connections.

          Fiberoptics are superior to purely

          • I think some of us might be sceptical about the idea because we remember seeing it on Tomorrow's World about 10 years ago, when it was being touted as almost ready for mainstream. The big advantage they were stating then was a much smaller die size, since photon paths could cross, while electron paths can not.
            • The idea certainly isn't new but the field of fiberoptics has come a long way in the last ten years. With advances in material design and more experience with the behavior of light and modulation of a light signal maybe we finally have the real technology to make this work.

              da Vinci designed a flying machine in 1500, some 400 years before the Wright brothers took off.
  • What? (Score:5, Insightful)

    by scheme ( 19778 ) on Saturday September 17, 2005 @09:12PM (#13587671)
    Are Photonic Processors the next logical step, or will the almighty buck shuffle them aside because of cost?

    If photonic processors go into widespread usage, it will probably be because of the almighty buck and companies deciding that they can make more of it by producing photonic processors.

    Profits and competition are the main reasons for a lot of the recent advances in processor performance. Look at the processor introductions back when 486 and Pentium processors were around and Intel didn't have any credible competition.

    • Not a credible player.

      Back in the day, "real" computer manufacturers scoffed at Intel. IBM would only let them produce the chips for the PC after Intel found another manufacturer willing to produce the part in case Intel tanked. The PC was nothing to boast about compared to the mainframes of the day.

      Slowly but surely, Intel grew to become the monster they are today. The turning point was somewhere near the Pentium II, when Intel machines were beginning to be used as engineering workstations. Profits tru
      • At that time, nobody would buy any chips from anyone (Intel, TI, National, etc.) without a second source. It wasn't just in case the manufacturer tanked; it helped if they couldn't meet production quotas or had a fab go non-operational for a long time. Of course, either of these happening too much would cause you to tank, but while the high-density IC market was starting off, manufacturing problems were the norm.
    • Profits and competition are the main reason for a lot of the recent advances in processor performance. Look at the processor introductions back when 486 and pentium processors were around and Intel didn't have any credible competition.

      Wow. Only on Slashdot would THIS be considered insightful.

      There were *MORE* options for chips back then, not less. MIPS. Alpha. Power. 680x0. AMD 486. Cx 486.

      No innovation here... (!)

      Intel has pretty much always "pushed it". Their projected timelines for MHz improvement has be
      • There were *MORE* options for chips back then, not less. MIPS. Alpha. Power. 680x0. AMD 486. Cx 486.

        In the consumer market you had a choice between Intel and its clones (Cyrix, IDT, AMD). By the time the Pentium and PII processors were around, only Intel offered reasonable performance. Alpha, POWER, and MIPS weren't options for consumer systems. The 680x0 was only used in Apple systems.

        The situation now is about the same: for consumer PCs you have a choice between POWER and Intel/AMD. The d


    • There is no reason for much faster processing outside of specialty applications. The only possible use (for the average computer owner) is digitizing video faster.

      Then again, marketing can be an amazing thing and convince people that they NEED something that they really don't. This type of hype will be going down the tubes as more and more people discover that what's being tossed at them is just more blinky lights and fancy buzzwords.
  • by MegaFur ( 79453 ) <wyrd0.komy@zzn@com> on Saturday September 17, 2005 @09:13PM (#13587676) Journal

    Are Photonic Processors the next logical step, or will the almighty buck shuffle them aside because of cost?

    This makes it sound like "the almighty buck" is the bad guy. I think this is one of those times when that's not the case. If fully photonic processors turn out to work best, then that's what we'll see. If they're not, and if the article's claim that copper interconnects are reaching their limit is true, then we'll see some hybrids. Rock on.

    This whole article seems like an attempt to pad out a slow news day. Maybe we can turn this article into something useful, or at least more entertaining. We could start a flamewar! Yeah!


    ----------

    <engaging fake troll mode>It's gotta be Photonic chips all the way man!!! Copper procs, yer all gonna burn in silicon hell!! yeah, burn baby burn! I unleash light-based clocking on all you 1nf1d3l5!!

    (etc)

  • by Steve525 ( 236741 ) on Saturday September 17, 2005 @09:32PM (#13587723)
    The article didn't say a whole lot, did it? It just said, "Gee, wouldn't photonic clocking be nice," without saying much about how, or whether it's feasible.

    So, I'll quickly fill in what I know. To do clock distribution you need two types of components: waveguides and detectors. Let's assume you are going to work in silicon...

    Waveguides function as the optical wiring and include things like bends and splitters. Although perhaps not trivial, it is relatively straightforward to make waveguides in or on silicon. Detectors, on the other hand, are not so easy, at least at the wavelength most people are interested in, 1550 nm. There are a number of people researching Ge growth for detectors on Si, and this does have promise, but it's not ready yet. Another option would be bonding InGaAs, but that might always be too expensive.

    Now, if you want to do full-on optical communication on chip, you'll want modulators, too. These have been demonstrated by Intel and Cornell in silicon, but only at speeds around 1 GHz. Optical amplifiers would be nice, too, and these have been demonstrated (using Raman amplification) by Intel and UCLA. (I'm not sure Raman amplification can give you the sorts of gain and efficiency you really need, though.)

    (Sorry, I won't be able to respond to any replies; at least not until Monday. I'm off to bed and I'm not planning to be near a computer tomorrow).
    • Detectors, on the other hand, are not so easy, at least at the wavelength most people are interested in, 1550 nm.

      Too true. However, this page [photonics.com] says LightTime LLC, whose Chief Research Officer wrote the article being discussed, is working with mode-locked lasers centered at 860 nanometers. That's a piece of cake for silicon to detect (although making those lasers cheap, reliable, and phase-lockable will be a nice trick.)

      • > (although making those lasers cheap, reliable, and phase-lockable will be a nice trick.)

        I did some research on this about 10 years ago; the field has come a long way since. Here's [ucl.ac.uk] the first paper in bulk optics. We created both homodyne and heterodyne phase-lock loops with two semiconductor lasers. In the heterodyne case we could lock with up to a 24 GHz frequency offset.

        A lot of the original work for this was done at UCL; check out [ucl.ac.uk] their list of papers, searching for 'phase-lock loop'. I believe they have al

    • I'd say semiconductor optical amplifiers (SOAs; Google it) are better candidates for the amplification. And they are already at 40 Gbit/s, but again only on an InP or GaAs basis.
  • by mikael ( 484 ) on Saturday September 17, 2005 @09:36PM (#13587737)
    Assuming that the clock circuitry takes up 30% of the chip, wouldn't manufacturing a chip with both photonic and electrical circuitry be more expensive than just manufacturing a purely photonic chip?
    • One word: how?

      Seriously, while it is possible to do optical switching, the methods are either inefficient or need hugely powerful light sources. There is some research to alleviate the problem _in Si_, but it's still only research, years away from practical use. As it is, I think that using optics only for interconnects is an already doable way of improving performance, and not just within the chip. Content-associative memories (which can be built with SLMs, passive optics and cameras) can greatly help tasks li
  • Hey, now we can use this for our 600GHz 2TB laptop!
  • by postbigbang ( 761081 ) on Saturday September 17, 2005 @10:30PM (#13587873)
    Consider the semiconductor. The way we work today is based on binary elemental logic-- on, off, unknown/neutral. Your basic light switch (SPST) is your basic computer, but it can't count too well.

    The evolved state of computing uses Boolean logic to mosh states together into integer algebra, then other kinds of math transformations.

    Now, consider what light does, and how it flows. Light (actually this segment of the electromagnetic spectrum) has different frequencies, at about the same data rate depending on the medium. No information there, except frequency differences and blendings of frequencies... lambda moshing.

    You can modulate light, like any other electromagnetic phenomenon. You can therefore modulate information onto light; it's done all the time. By adding information, you can blend things together, then demodulate them to see what happened as the change in information. This modulation mimics how ALUs/accumulators/CPUs work with logic states in some ways, but now we have to multiply the effect to get to significant digits and significant logic handling-- math by light modulation and the devices that can do that. But not densely, so far, in the calculative/logic-state-change-tracking sense.

    What of these devices-- aye, thar's the rub. Is there an advantage to using light to do math? Not yet, really. It doesn't meet the state change efficiency model. One day, it might. Today, we lack the ability to make things dense enough. That's why photonic logic may fall short of expectations.
    • If I'm reading this properly, light-based circuits could be decimal-based, which may be an advantage if it's possible. With a decimal-based architecture: 1 - less hardware can store higher values (greater capacity per memory cell); 2 - comp sci will be MUCH easier and more intuitive =); 3 - you could have the processors/circuitry operating on the visible spectrum, offering the most awesome-looking case mods imaginable! YARR!
      • 1) No, you're not reading it right. No literal use of "digits" was meant; the stuff about "significant digits" had to do with quantifying amounts of information.
        2) Your other points are wrong, wrong, and incredibly silly, in that order. Working with decimal would be an order of magnitude harder than binary, not easier. It's been tried, a long time ago ;) Ternary, on the other hand, might still have some hope.
        • I understood what you were talking about with significant digits... I remember those architecture lectures... but I was musing in a slightly different direction. "Working with decimal would be an order of magnitude harder than binary, not easier." I never said it would be easier... I was musing that it might be more efficient. I'd also like to say that flying was tried a long, long time before the Wright Flyer took to the sky... the Analytical Engine designs by Babbage were a century ahead of their time... who says
          • This is done all the time. You muse correctly; it is done as an abstraction on top of binary. Google for "BCD" (binary-coded decimal).
          • You said:

            [I] never said it [decimal based architecture] would be easier.. i was musing that it might be more efficient.

            No, it wouldn't be more efficient, either. The most efficient base for numerical representation is e. You know, good ol' 2.718281828...

            The farther you get from this number, the less efficient your number system is. Since it's kind of hard to represent numbers in base e with anything other than an analog circuit, this means that base 3 is the best you can do. Not base 2, and certainly n
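
            (The usual radix-economy calculation behind that claim, sketched in Python: representing numbers up to N takes about log_b(N) digits, each digit costs hardware roughly proportional to b, so only b/ln(b) matters:)

              import math

              # cost ~ b * log_b(N) = (b / ln b) * ln N, so compare b / ln b:
              for b in [2, 3, 4, 10, math.e]:
                  print(f"base {b:6.3f}: relative cost {b / math.log(b):.4f}")

              # base e is the minimum (~2.718); among integers, base 3 (~2.731)
              # narrowly beats bases 2 and 4 (both ~2.885), and base 10 (~4.343)
              # loses badly.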

      • If I'm reading this properly, light-based circuits could be decimal-based, which may be an advantage if it's possible. With a decimal-based architecture: 1 - less hardware can store higher values (greater capacity per memory cell); 2 - comp sci will be MUCH easier and more intuitive =); 3 - you could have the processors/circuitry operating on the visible spectrum, offering the most awesome-looking case mods imaginable! YARR!

        Regarding these points in turn
        1. Base three is better (base e is ideal, but being a
    • Sorry to the parent, but people seem to be taking this seriously, so I gotta point out that this is BS, in the hope that no one does...

      "Consider the semiconductor."
      Ok, here is the parent post's first fundamental misconception. Digital doesn't necessarily mean semiconductor. Take, for example, CDs, which encode digital data using light.

      "By adding information, you can blend things together, then demodulate them to see what happened as the change in information."
      that isn't how light works, the waves superpose
      • No kidding... the GP looks like it was written by one of those random text generators.
      • Re:mod parent down (Score:3, Informative)

        by postbigbang ( 761081 )
        >>Sorry to parent, but people seem to be taking this seriously so I gotta point out that this is BS so hopefully noone takes this seriously...

        Oh? Read on.

        >"Consider the semiconductor."
        >>Ok, here is the parent posts first fundamental misconception. Digital doesn't necessarily mean semiconductor. Say, for example CDs which encode digital data using light.

        No, you misconstrue it. Transistor logic is what's used to do state changes that amount to the various relationships that form what a CPU does
  • by Dark Coder ( 66759 ) on Saturday September 17, 2005 @10:33PM (#13587881)
    LaForge: Gee, I don't know... These RocketIO buses are mighty archaic, and PCI even more so.

    Data: RocketIO is rated at 9.8 Gbps, and PCI-X6 is rated at an even slower 8 Gbps aggregated.

    LaForge: Yeah, right. Nothing compared to our photonic bus of 980 THz over each of the 2^1024 channels.

    Scotty: Ayie! Why don't they get with the program, laddies? I kin nev'r understind them, bloody buses.
  • by Call Me Black Cloud ( 616282 ) on Saturday September 17, 2005 @10:49PM (#13587922)

    The "almighty buck" won't shuffle it aside...people will. If the processor is not cost-effective then it won't catch on. However, the processor may become the SUV of chips. It may not be the most cost-effective solution but a purchaser may feel it helps him compensate for his undersized penis.

    You know, when I started writing this post that's honestly not where I was going. I was going to make some point about how marketing may overcome the possible lack of value of the chip (a la VHS vs Beta) but then the post just headed south...
  • by Anonymous Coward
    These look cool; they're supposed to oscillate at up to several tens of gigahertz.
    http://physicsweb.org/articles/news/9/9/9/1 [physicsweb.org]
    • by Anonymous Coward
      Physicists in the US have shown that two nano-scale magnets can be made to oscillate in phase when they are positioned close to one another. The phenomenon, which is similar to the way two pendulum clocks mounted on the same wall become synchronized via the weak coupling of acoustic signals, produces a stable microwave output. It could therefore replace bulky and expensive components which operate on the same "phase-locking" principle in devices such as portable phones and radar systems. The magnets may als
  • Design (Score:3, Insightful)

    by SilverspurG ( 844751 ) * on Saturday September 17, 2005 @11:30PM (#13588040) Homepage Journal
    A processor, greatly simplified, is a collection of logic gates. These logic gates, greatly simplified, are nothing more than modulators. In hardware design, the modulation of the electrical signal indicates the result of the logical function of the circuit. Electrical impulses are measured in cycles/second.

    Photons can achieve frequencies vastly in excess of current processor speeds. The function of a photonic logic gate would be measured by simple amplitude modulation. A photon has a frequency and an amplitude. Using a photon with the energy of a gamma ray would be _FAST_, have negligible heat loss due to the friction which plagues electronic processors, and the amplitude of the photon could be easily modulated by passing it through different materials. Different materials of different refractive indices and transparencies (see fiberoptics) would be the photonic equivalent of electronic resistors and capacitors.

    I can only wait for the development of photonic processors. :)
    • A photon has a frequency and an amplitude

      A photon has a wavefunction, that's it. Assuming the photon has a relatively sharp energy, you can then say it has a frequency (energy) and a phase. Amplitude is meaningless when speaking about a single photon.

      I can only wait for the development of photonic processors.

      You'll be waiting a long time, I think. People have been trying this foolhardy idea of optical computing for a long time, and the field is pretty much dead. The problem is that to get light to modu

      • A photon has a wavefunction, that's it.

        All particles have wavefunctions, even electrons. It's the derivation of the math for the particle-in-a-box which yields that wavefunction. I suppose you'll probably continue to argue until I pull "Quantum Chemistry and Molecular Spectroscopy" off the shelf and start quoting lines and lines from the derivation.

        Amplitude is meaningless when speaking about a single photon.

        This is about photonic computing, not quantum computing. Just as in fiberoptics we're not t

      • You'll be waiting a long time, I think. People have been trying this foolhardy idea of optical computing for a long time, and the field is pretty much dead.

        The field isn't so much dead. What is "dead in the water" (at least for now, and as long as there is no further breakthrough) is the idea of building processors that use photons the same way current processors use electrons.

        However, optical processing has other applications that could be put to good use. Computing Fourier transforms like
        • I agree with you. I should've been more explicit and stated that I was talking about general-purpose digital computing with optics. Having said that, even the Fourier transform stuff that once looked so promising really doesn't look so good. How do you get the data in and out of the system? In the end, you need a spatial light modulator, and those are so damn slow that you can always compute a Fourier transform faster on a DSP chip, despite the parallelism of light.
    • Heh, those led computer cases are gonna be heavy
  • At first... (Score:3, Funny)

    by Hangin10 ( 704729 ) on Saturday September 17, 2005 @11:30PM (#13588041)
    I thought the article said "platonic" clocking... I was thinking... I would hope they loved their clock..

    Note to self: Don't read Slashdot too late at night..
  • No, really. As soon as it gets anywhere near a point where there's a large-enough market for it, it'll be sold. Witness the present collection of chips: do you *really* think that the majority of people using computers need a P4 at >3 GHz? No, they don't. The minute the niche market of gamers (and, yes, it'll be gamers) who can afford it is large enough, it'll hit the market.
    • I have a dual 2.7 GHz system with 2 GB of RAM and an excellent video card, but my system still gets overtaxed (and I don't game).

      Until new architectures cease to offer even slightly significant increases to performance and response time, I will continue to buy new systems with more power every couple years.
  • Black hole clocking would be WAY more affordable. You get data processed before it's even fabricated! Think of the advantages in CS! Lastly, in the case that black holes let you go back in time, then we will already have black-hole processors!!

    AMD will also sue Intel now, in fear that the last part comes true.

  • by zrk ( 64468 ) <spam-from-slashdotNO@SPAMackthud.net> on Sunday September 18, 2005 @12:20AM (#13588174) Homepage
    And move on to Photonic CANNONing [startrek.com]. After all, the Borg are gonna hit us one of these days, and not just through Bill
  • Quantum First (Score:3, Interesting)

    by illumina+us ( 615188 ) on Sunday September 18, 2005 @12:59AM (#13588276) Homepage
    I have a feeling that quantum computing will happen before photonic computing. That's just me though...
  • Let's see here... 100 GHz is 100,000,000,000 cycles/s; at the speed of light, 299,792,458 m/s, that gives about 3 mm that the signal can travel per cycle without breaking the laws of physics.

    Hope they've figured out how to make things really really small also.
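
    (Redoing that arithmetic, trivial as it is, keeps the units honest:)

      c = 299_792_458.0        # speed of light in vacuum, m/s
      f = 100e9                # a hypothetical 100 GHz clock
      print(c / f)             # ~0.003 m: light covers only ~3 mm per cycle, so a
                               # chip-spanning clock net has a few millimetres of
                               # flight per tick to play with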
    • Yes! Really small things in photonics are being researched. However, as your structure size will always be on the order of the wavelength, actual structures will never be as small as in 45 nm CMOS and beyond. But anyway, it's the density of function per unit surface that counts, not how small your smallest structures are.
