
MIT, Nanovation to Partner on Photonic Research
Tirisfal writes "The Massachusetts Institute of Technology and Nanovation Technologies Inc. today announced plans to establish a world-class center dedicated to the research and prototyping of photonic technologies, a 21st-century field that will make communications hundreds of times faster. Check out the press release here."
Couple thoughts... (Score:1)
2. This is not quantum computing (that's the future too, just far out and weird). It is MEMS.
3. See a real live MEMS here
http://www.memsrus.com/cronos/figs/MOEMSfs.pdf
4. Investors: a new industry is being created -- it will change the world, again.
5. 15-year-olds who think the Internet is already legacy: well, MEMS (in the '60s the word was "plastics").
For the past 50 years we've been shrinking electronic devices. Now we begin shrinking mechanical devices. Combine that with light and mirrors and the world looks new and different: Star Wars-style projected 3D holograms, light-based computers, space propulsion devices... use your imagination, it's a new world upon us.
Re:hundreds of times faster... (Score:1)
That was excessively pedantic. Please refer to the following excerpt from the definition of "fast" in Merriam-Webster's Collegiate Dictionary.
3 a : characterized by quick motion, operation, or effect: (1) : moving or able to move rapidly : SWIFT (2) : taking a comparatively short time (3) : imparting quickness of motion (4) : accomplished quickly (5) : agile of mind; especially : quick to learn [Emphasis mine]
In case you've been under a rock for the past five years, the word "faster," in the sense of the italicized definition above, is commonly used to refer to higher bandwidth, because transferring data takes a comparatively short time with high-bandwidth connections. For example, read the sentence, "My Internet connection at work is faster than my connection at home." If you think this is improper use of the word faster, you are deluded. It's not like the poster said the signal propagates faster.
Re:Addendum: why you can't use x-rays. (Score:1)
negative mass (Score:1)
But if faster-than-light travel winds up being the result of sticking your finger down your throat, I for one will be surprised.
yes comrade (Score:1)
I am disgusted by the Slashdotters who pretend to be good socialists when you talk about this "Lee-nooks" thing, but then you see they are still capitalist swine when you try to discuss political socialism.
Come comrade, let us move to China, a land with true freedom!
"I am looking for the nuclear wessel"
I find it a little bit amusing (Score:1)
That Slashdot has chosen to parrot the press release in using words like "21st century fields".
Of course something that started development in the year 2000, with promising results that may take 2 to 5 years to realize, will CERTAINLY be a 21st-century field!
Unless someone has successfully built and tested a time machine, I do not think we can invent something from today onwards and call it an "18th century invention" or a "12th century BC discovery".
I hope that Slashdot from now on will not parrot _every_word_ from the press releases it receives.
Making use of intelligent phrases, yes please; parroting tired and mind-numbing clichés, no thanks!
hmm (Score:1)
Hmm ... few questions (Score:1)
First of all, it is a startup with no real product yet. And what is interesting is that they raised $56M from private investors -- with no product yet. I mean, how much do startups in San Jose have in the beginning? $5-10M (correct me if I'm wrong).
They have facilities near (maybe right on) the campus of Northwestern Univ. My friend went there and was not very impressed. So they do some semiconductors, some polymers (hired some people from NU and the IBM Almaden research center).
The stuff they talk about in their PR is not here yet. PBG exists mostly on paper (and by the way, the best work in this area is done by Caltech, Princeton, and another university I don't remember right now). It was proposed by Prof. Yablonovich from UCLA (but they use it mostly for microwaves) in the early '90s.
And MIT, being a great school for science and engineering and all, hasn't done much in photonics. They have Lincoln Labs (mostly microwaves, and it was not even mentioned), and they have a professor who wrote a book on PBG (sorry, photonic bandgap materials), but that was mostly theory. It's all quite far from real device implementation. (Point: it is not the stuff you make money on, yet, and it is not quite clear that you ever will.)
(redundant:) So, Nanovation raised $56M, and $20M of this they give to MIT basically to start almost from scratch in a field where others have already done something.
... And the money comes from a startup with no real product. And a helluva lot of money.
I don't understand it.
Re:Addendum: why you can't use x-rays. (Score:1)
"Nanoscale", though, I'm skeptical of. Double-digit nanometres is well into the X-ray regime. Single-digit nanometres is worse. How do the researchers you cite plan to overcome the severe problems encountered with photons of this high energy? Or am I making too drastic assumptions about the scale of the devices being talked about?
Umm... (Score:1)
The wavelength of any visible light is too long for photonic effects to come into play. Trust me, I did my doctoral thesis on this.
Umm... The fact that a respectable university is funding this and that I have heard the technologies the article discusses mentioned by other research groups over the years implies that what the article is calling "photonics" and what you consider "photonics" are not the same thing.
Remember, the article was not written by someone well-versed in the "correct" terms for things. By "photonics", they probably mean "nifty research areas x, y, and z that have to do with light". Read further into the article for more details on what they're actually studying.
Re:Bear in mind the wavelength limit. (Score:1)
Both of these are exactly the reasons (if my understanding of the technology from the white papers I have read is correct) why optical chips should be faster:
Re:Amazing ParticleWaves... (Score:1)
I don't mean to split particles here... (ahem) but isn't that basically the kind of digital communication we have now? Streams of ones and zeroes? If an entangled "particle" can have its "state" changed and its "twin" -- existing somewhere else -- mirrors the state change, then you have a bit that can be turned on and off, right? Just checkin'.
Re:More sources of research (Score:1)
Re:hundreds of times faster... (Score:1)
Step in the right direction... (Score:1)
nanotech could use such a partnership (Score:1)
Re:hundreds of times faster... (Score:1)
I'm not saying this kinda research is bad, but what's the point if that bandwidth is not being utilized because the local loop is still so slow? Yes, I know, when you bunch 'em all together you get formidable bandwidth, but that is all the more reason why there should also be money invested in things that utilize bandwidth better, like MBONE.
Cheers!
Costyn.
Re:Amazing Electrons (Score:1)
I'm fairly sure that split electrons, even theoretically, can't be used for FTL communication. Tachyons have never been observed, and no one has any idea how to create them. However, IANAPP (I am not a particle physicist).
You're quite right that latency will be a problem if we colonize other planets. The speed of light is high enough to solve almost any latency problem on this planet by laying cable, but it is already a problem with satellite connections.
communications nothing... (Score:1)
Photons move faster than electrons (unless you use a superconductor, which is way more expensive than using light). So if you replace the electrons flying around inside your chip with photons flying around, you get a faster chip. I'm guessing it would use less power too (heck, just hook it up to your roof: light comes in, you control the flow and put it directly into your system, instead of converting it to electricity, and then back).
Optical computers are going to be the final limit to how fast we can get stuff unless some radically new designs are made in the near future. We're already rubbing up against the physical limit for transistors...now we'll rub up against the physical limit for the speed of particles flying around inside the chip.
Instead of Moore's law, we'll have another law: the speed of a chip is directly proportional to its size. I suspect we'll eventually find the best architecture for any one given task. Once you get to the limits of size and speed, plus you've found the best architecture for the task at hand, the only way to make your system faster is to make it bigger (or write better software...)
First Things First (Score:1)
Re:Ultra funky tech (Score:1)
Truly. I'm having fun just imagining what "photonic" bussing would look like. I see a chunk of crystal, maybe, that's bevelled in just the right way to get individual photons where they need to go, or a network of these crystals.
And that's probably the least of the weirdness.
hundreds of times faster... (Score:1)
And I hope it will make it a little more difficult for the feds to wiretap...
If you can't figure out how to mail me, don't.
Questions (Score:1)
What kind of opportunities will this bring to an undergrad at MIT? How sensational would an undergrad have to be to have an important place in the project?
The article says that discoveries made solely by MIT affiliates (or whatever) are owned by MIT. What does this mean in practice? What exactly does MIT do with a patent it possesses?
Thanks,
Jack
Re:You don't even know what a Haiku is. (Score:1)
Re:THAT WAS NOT A HAIKU (Score:1)
natalie portman:
natalie portman
magic petrification
i have a statue
open source:
sorry... it's my attempt at variety. teehee.
Re:THAT WAS NOT A HAIKU (Score:1)
Re:Nanovation Tech + MIT Brains (Score:1)
Re:Nanovation Tech + MIT Brains (Score:1)
I think we're talking about two different things.
Cracking a public key system like RSA would be trivial with a quantum computer because, as you say, you can do it in polynomial time.
You don't get the same kind of performance gains when breaking a symmetric cipher (DES, IDEA, any of the AES candidates, etc.). In those cases, the best quantum algorithm only reduces the search time to its square root. (Maybe this is Grover's algorithm? I'm not familiar with it.) Still a huge difference, but it wouldn't be the end of cryptography.
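To get a feel for the scale of that square-root speedup, here's a back-of-the-envelope sketch (my own rough illustration using the textbook 2^(k/2) Grover bound, nothing from the article):

    # Illustration only: a Grover-style search needs about
    # sqrt(2**k) = 2**(k/2) evaluations to brute-force a k-bit symmetric
    # key, i.e. it halves the key's effective strength in bits rather
    # than breaking the cipher outright.

    def grover_effective_bits(key_bits: int) -> float:
        """Effective security, in bits, against a square-root quantum search."""
        return key_bits / 2

    for k in (56, 128, 256):  # DES-, AES-128- and AES-256-sized keys
        print(f"{k}-bit key -> about 2**{grover_effective_bits(k):.0f} quantum evaluations")

So even with that speedup, a 256-bit key still leaves an attacker with roughly 2^128 work, which is why it wouldn't be the end of cryptography.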
I have a question regarding the really dumb post you replied to. Did the article even discuss quantum mechanics?
Heh, I didn't read it. Just saw his goofy comments and had to get out of lurk mode for a few seconds.
Re:Ultra funky tech (Score:1)
Re:Amazing Electrons (Score:1)
Umm...last time I checked, binary was on-off, and it seems like you can do a lot with binary...
The problem with using twinned photons, IIRC, is that there really isn't any information going from one to the other. Observing one doesn't produce a change in the other, it just predicts the other's state. Think of it this way: you shoot a helium nucleus at some sort of splitter, and it splits into two nuclei. You then observe one of the nuclei. If it's deuterium, you know the other is tritium, and vice versa (assuming no neutrons escaped or anything like that; I doubt you could actually split a helium nucleus like this, but bear with me for the sake of argument). It's the same idea. If one of the photons turns out to have right spin, the other must have left spin and must have already had left spin. It is just a deduction based on what you already know about the system.
The future of slashdot:drivel from idiots (Score:1)
WTF are you talking about? (Score:1)
So I guess we should stop all research into photonics now, right?
Thanks for your drivel.
A step in the right direction (Score:1)
Obviously it's to some monetary or strategic benefit of Nanovation to provide this 90 mil, but I am glad companies like this are leading the way.
I just hope this technology will find its way overseas. Often, a new technology is developed in the US, Japan, or Europe, and it stops there.
Let's hope it's not the case with this.
Amazing Electrons (Score:1)
So, although a light connection would be fast for short-range (Earth-based) connections, it just won't do for long-range connections.
Just some humble thoughts.
Re:Amazing Electrons (Score:1)
More sources of research (Score:2)
The field has been around for quite some time, so there's a lot of information on the web about it. Certainly, MIT's partnership will help push things along, but it is only a very small piece of the research puzzle.
Re:Amazing Electrons (Score:2)
Dragon218 dun said:
Well, unless and until someone comes up with a Theory of Everything that meshes with both quantum mechanics and Einstein, and also allows FTL travel... I think we might be stuck with c as the speed limit for the observable universe. :P
If memory serves, it HAS been proven that a change in one "linked" particle instantaneously causes a change in its "twinned" particle, but the scientists who discovered this doubt very much it will ever be useful for communication. (For one, it probably won't work across light-years, and for two, probably the most complex method of communication you could do with it would be Morse-code-type on-off communication.)
As far as tachyons go... as someone noted, firstly, assuming they exist at all, nobody has any earthly idea how to create them. (This is, in part, because nobody really knows how to make matter with negative mass -- which, at least according to our understanding right now, is what you'd need, because once you hit c, unless you're massless or have negative mass, you have infinite mass -- wanna birth a universe, anyone? ;) Also, Einstein's formulae for relativity, at least for mass and time dilation, go REAL funky once the magic barrier of c is passed -- I've played about with values over c in the equations for shits and giggles, and you get odd answers like, oh, imaginary time and imaginary mass... maybe you really DO end up spawning a baby universe :).
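If you want to reproduce that funkiness yourself, here's a quick sketch (just the standard Lorentz factor, evaluated with complex square roots so that v > c doesn't simply throw an error):

    import cmath

    C = 299_792_458.0  # speed of light in vacuum, m/s

    def lorentz_gamma(v: float) -> complex:
        """Lorentz factor 1/sqrt(1 - v^2/c^2), computed with a complex
        square root so values of v above c don't raise an error."""
        return 1 / cmath.sqrt(1 - (v / C) ** 2)

    for frac in (0.5, 0.9, 0.99, 1.5, 2.0):
        print(f"v = {frac}c -> gamma = {lorentz_gamma(frac * C)}")
    # Below c, gamma is real and blows up as v approaches c.
    # Above c, 1 - v^2/c^2 goes negative and gamma comes out purely
    # imaginary -- the "imaginary time and imaginary mass" mentioned above.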
In fact, if memory serves, tachyons (or at least their predicted existence) are what ended up doing in one of the first superstring theories, which had a solution requiring the universe to have 26 dimensions. Most of the newer superstring theory flavours (including M-theory, which is sort of a "superset of sets of superstring solutions" and factors in an 11-dimensional universe, of which there are six solution sets involving 10-dimensional solutions) do not predict tachyons. Weird stuff like photinos and quarkinos (supersymmetric "twins" of photons and quarks, only the photinos are the "mass" particles and the quarkinos carry force), sure -- weird enough stuff is predicted that at least one fellow wrote a novel called "Moonseed" whose plot largely revolves around VERY funky subatomic particles predicted in some flavours of superstring theory -- but no tachyons. In fact, the very fact that the 26-dimensional flavour of superstring theory required tachyons is considered a fatal flaw in that theory.
Now, if you can find a way to create negative mass, I think we can maybe lick that whole FTL-communication thing. Not to mention find a way to make stable wormholes, invent FTL travel, and find out whether black holes really DO become baby universes if they don't evaporate away due to Hawking radiation :) I'm more than certain you'd win a Nobel Prize at the least, not to mention give science-fiction writers wet dreams for the next millennium :) Till we do, or we find a better theory in which FTL travel works without breaking time or mass, we might be screwed though :P
Um, they're talking about more mundane photonics. (Score:2)
What you describe is one of the standard proposals for FTL communications. It is indeed interesting; however, if you glance at the later parts of the article, you can see that they're talking about more mundane applications of photons:
I hope this was of interest. Purely optical computing is neat, but would be less useful now than it would have been a few years ago. More on this in another message.
Bear in mind the wavelength limit. (Score:2)
Bear in mind that your computing element size will be limited by the wavelength of the light you're using, though. While waveguide effects might let you push this a bit, remember that the feature size of current chips is already into the "extreme ultraviolet" wavelength range. The wavelength of an electron (at normal energies) is much shorter, making the feature size limits of electrical devices much smaller than those of light-based devices.
This doesn't mean light-based devices are useless; on the contrary, as was pointed out, they tend to dissipate considerably less heat than (conventional) electronic devices. It may also turn out that it is easier to build three-dimensional optical devices than it is to build three-dimensional integrated circuits (both have been done; ICs are just very difficult with current processes). However, I'm skeptical of claims that optical devices will _definitely_ be faster than even the best electrical devices.
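To hang some rough numbers on that wavelength comparison (a quick estimate of my own, using the non-relativistic de Broglie relation and round constants):

    import math

    H = 6.626e-34    # Planck's constant, J*s
    M_E = 9.109e-31  # electron mass, kg
    EV = 1.602e-19   # joules per electronvolt

    def electron_de_broglie_nm(energy_ev: float) -> float:
        """Non-relativistic de Broglie wavelength, lambda = h / sqrt(2*m*E), in nm."""
        return H / math.sqrt(2 * M_E * energy_ev * EV) * 1e9

    def photon_wavelength_nm(energy_ev: float) -> float:
        """Photon wavelength, lambda = h*c / E, roughly 1240 eV*nm / E."""
        return 1239.84 / energy_ev

    print(f"10 eV electron:          ~{electron_de_broglie_nm(10):.2f} nm")
    print(f"100 eV electron:         ~{electron_de_broglie_nm(100):.2f} nm")
    print(f"2.5 eV (visible) photon: ~{photon_wavelength_nm(2.5):.0f} nm")
    # Even a slow electron has a wavelength of a few tenths of a nanometre,
    # while visible light sits at hundreds of nanometres -- hence the much
    # smaller feature-size limit for electrical devices.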
Addendum: why you can't use x-rays. (Score:2)
Oh yes - before you suggest just using a smaller wavelength of light, that runs into two problems, both due to the fact that your photons wind up having very high energies.
In order to carry a signal, within a given sample period you need on the order of n^2 photons if you are trying to measure n signal levels. This is due to the statistics of measurement errors. The least painful case uses a binary signal, with only two levels (on and off). However, you still need to send between two and four photons per clock to be reasonably sure of detecting "1"s. The problem is that as you reduce feature size, you are both increasing the clock rate and increasing the energy of the photons used. Power dissipation per communications stream goes up as the inverse square of the wavelength. As you will be packing more communications lines onto the chip, your actual power dissipation will be even worse than that. Thus, you rapidly run into energy limits when reducing the wavelength.
Visible light is about as energetic as you can get without damaging chemical bonds in at least some materials. Many materials can resist higher-energy photons, but many can't (think of plastics that turn yellow in the UV light from the sun and fluorescent light bulbs). When you start moving into hard UV and soft X-rays, the problem rapidly gets worse. Your energy per photon is considerably higher than the energy stored in the chemical bonds in your material. Thus, you will get fairly frequent interactions where chemical bonds are broken or rearranged. Your material will degrade over time, probably quite quickly with the brightness you'd need (see first point).
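For a rough feel of the numbers (my own estimate; I'm assuming typical chemical bond energies of a few eV, roughly 3-5):

    # Photon energy: E = h*c / lambda, which works out to about
    # 1240 eV*nm / lambda(nm).  Anything well past the visible carries
    # far more energy per photon than a typical chemical bond.

    def photon_energy_ev(wavelength_nm: float) -> float:
        return 1239.84 / wavelength_nm

    for label, wl in [("near-IR (telecom)", 1550),
                      ("visible (green)", 550),
                      ("hard UV", 100),
                      ("soft X-ray", 10)]:
        print(f"{label:17s} {wl:5d} nm -> {photon_energy_ev(wl):6.1f} eV per photon")

    # Per the inverse-square argument above: halving the wavelength doubles
    # both the photon energy and the clock rate, so power per signal stream
    # scales roughly as 1/wavelength^2.
    print("relative power, 550 nm vs 10 nm:", round((550 / 10) ** 2), "x")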
Like I said, optical computation is a neat idea, and is very useful for many things, but is unlikely to completely replace electrical computation.
Re:Bear in mind the wavelength limit. (Score:2)
1. The lowered heat dissipation would allow smaller gate sizes without crosstalk/circuit failure becoming a serious problem, and
2. The ability to go three-dimensional shortens the actual circuit pathways.
You seem to be overlooking the points that I raised in my previous post, and are replying only to the caveats.
Regarding 1) - electrical circuits are already denser than optical circuits ever will be. Feature size cannot be much smaller than the wavelength of light used. This eliminates the possibility of "smaller gate size".
Regarding 2) - Check out your own quote of my post. You _can_ build three-dimensional electronic circuits. Several groups have been playing with this for a while. There are engineering difficulties when you try to do it in production quantities, but nothing that can't be overcome.
3. An optical-circuit-based bus in the computer would remove the bandwidth of the bus as a major bottleneck/expense.
This, I agree with completely. However, I feel that optoelectronic systems would be more practical than purely optical, for the reasons mentioned above.
A few comments (Score:2)
(Flame retardant: yes, I've conducted a search and found it out. I'm merely pointing out that the PR rants for pages and pages about an undefined word. Maybe it's just my innate dislike for the phony buzzword-heavy writing style. Ah well.)
Anyway, this is really interesting. Obviously there are plenty of "but"'s, as other posters have no doubt pointed out by now, but this is a real step forward in research. And it seems rather close to my primary field of interest (self-replicating artificial molecules) - maybe it will give me a chance to actually do some official work on it. Really makes MIT look better and better to me. Hmmmm, I can see it already... "Rafael Kaufmann, Ph. D."... oh yeah.
By the way, I'm still waiting for my nanites!
(P.S.: <spelling-nazi>The correct form is "revolutionise".</spelling-nazi>)
Nanovation Tech + MIT Brains (Score:2)
The ramification of those circuits is that we could essentially have CPUs 10-50x faster than current chips, with much lower energy consumption as well.
Anyway, this announcement strikes me as good karma for both Nanovation and MIT. One of the aspects which struck me as being highly positive is that MIT researchers will be free to publish their findings, although Nanovation will have the right to patent the devices.
The question I haven't answered for myself from the press release is whether or not the publishing of those results would allow others to develop the technology without infringing the patents... Comments, anyone?
Ultra funky tech (Score:2)
Imagine having stateless logic gates that you can send a thousand signals through simultaneously, or having a device as tiny as a pair of glasses that beams a perfect 3D image at higher resolution than your eyes can distinguish directly onto your retinas.
This stuff is da bomb.
Re:Nanovation Tech + MIT Brains (Score:2)
This is incorrect. Shor's quantum factoring algorithm achieves a polynomial time bound for factoring large numbers. I believe the square-root improvement is Grover's algorithm for general inverse problems. It is still possible that there is a large factor sitting in the engineering problem of constructing a quantum computer which would mess up Shor's algorithm, but there are currently no credible (information-theoretic) arguments for such a problem.
I have a question regarding the really dumb post you replied to. Did the article even discuss quantum mechanics? I did not look very closely, but I did not see it discussed. Meaning the donut message you replied to had about 0 clueons. I noticed the same person later on in the discussion making some wild claims about this article, saying quantum entanglement would create FTL communications. I'm thinking maybe we have a new type of troll?
I suppose someone could submit a story called "What
Jeff
Re:THAT WAS NOT A HAIKU (Score:2)
natalie portman
is naked and petrified,
happy is the troll
Note that I'm not the natalie portman troll guy, and I apologize to him if he's offended by this haiku.
Re:hundreds of times faster... (Score:2)
Quantum Entanglement (Score:2)
A quantum channel can also be designed to be untappable (search for quantum cryptography to find out more).
Meanwhile, quantum computers can bring the same exponential boost to some mathematical problems, one of which is factoring large numbers into primes (goodbye RSA and https).
Researchers are currently busy developing the basics of quantum computation and quantum telecommunications. Theory is still decades ahead of practice. E.g., physicists have already designed quantum telephone exchanges in theory (using entanglement swapping) to distribute entangled pairs of particles, but it will be years before you can buy a quantum router.
Re:Questions (Score:2)
From my experience, this is likely to give a lot of undergrads a spot on the research team, but in terms of "important places" -- highly unlikely. It is very rare that an undergrad obtains a top role in a major research project, let alone a widely-publicized one such as this.
The article says that discoveries made solely by MIT affiliates (or whatever) are owned by MIT. What does this mean in practice? What exactly does MIT do with a patent it possesses?
In practice, you will find that all universities retain IP rights to IP created by their professors -- otherwise why would they be paying them? (No, teaching is far too minor to be the primary source of income). A patent held by a university usually only means that those in the industry who want to actually implement the idea have to pay royalties (otherwise the world's scientific community would just be a public-use R & D team). A university would not, generally, put restrictions on a patent that would prevent the work from being at the basis of future studies.
Geosynch is pretty far up. (Score:3)
While it's true that satellites tend to have slower processors, latency due to the speed of light is very real. Think about your own calculation - for a satellite in Low Earth Orbit, about 300 km up (about 186 miles), you have a 2 ms latency round-trip. And that assumes that the satellite is directly overhead.
In practice, the situation tends to be much worse than this. Viewing at an angle can easily add a factor of two or three here, but that's for LEO; many satellites are instead in geosynchronous orbit, at about 40,000 km. At this altitude, they have an orbital period of 24 hours, which means that you don't have to keep adjusting your satellite dish to track them. However, it also means that you'll be getting about 130 ms delay _each_way_ to the satellite. Round-trip from one point on earth to another, and you start to see why you get latency.
Even fiber over the surface of the earth will give you latency. Per thousand km, you get about 3.3 ms latency each way (a ping of 6.7 ms). The farthest point from you is about 20,000 km away. That's almost (but not quite) as bad as geosynchronous orbit.
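If you want to sanity-check these figures, here's a quick script (it uses vacuum light speed, the same assumption as the numbers above; real fibre is roughly a third slower, so actual pings are worse):

    C_KM_PER_S = 299_792.458  # speed of light in vacuum, km/s

    def one_way_ms(distance_km: float) -> float:
        """One-way propagation delay in milliseconds at vacuum light speed."""
        return distance_km / C_KM_PER_S * 1000

    print(f"LEO, 300 km straight up, there and back: {2 * one_way_ms(300):.1f} ms")
    print(f"GEO, ~40,000 km, each way:               {one_way_ms(40_000):.0f} ms")
    print(f"1,000 km ground path, ping:              {2 * one_way_ms(1_000):.1f} ms")
    print(f"20,000 km ground path, ping:             {2 * one_way_ms(20_000):.0f} ms")
    # Real fibre carries light at roughly 2/3 of c, so terrestrial pings
    # come out about 50% worse than these best-case numbers.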
Re:Nanovation Tech + MIT Brains (Score:3)