Perforated Metal Advances Computer Technology

TeknoDragon writes "In the July Scientific American there's an article on how conductive metals can be made into optical sieves. Two applications of this technology pursued by NEC are color LCD screens up to six times as bright and photolithography techniques that would help chip plants upgrade to smaller fabrication processes."
  • Ha ha, funny. If you came up with a good idea, would YOU want to share it and let someone else get the credit for the discovery? When people discover something, the news says one name. Sure, you spent 70 years developing the technology to the 99% mark, but whoever gets that 1% at the end is the one who will be on the news. Open is not human nature.
  • > For nearly 10 years, Ebbesen struggled with the
    > problem, waiting, in the closed-mouth habit of
    > corporate researchers, to make his findings
    > public until he could explain and control (and
    > patent) the phenomenon.

    Sigh. The wonderful effects of closure in science. We could have had this 10 years ago if science were more open. As it is, if the discovering scientist can't solve the problem, then by damn no one is going to be allowed to solve the problem.

    This article is the clearest demonstration of why we need old fashioned OpenSource Science to return.
  • It's a good thing this came along, or we risked falling off the Moore curve. Even so, these holes aren't small enough for proper x-ray lithography, so it looks like we're still stuck with that five-atom-width absolute limit of traditional litho if technology takes this route.

    The question is, though: is squeezing every last breath out of trad litho the way to go? A couple of hexagons on the wall of a bucky tube can form a complete logic gate; molecular nanotech will soon build single-molecule transistors (check out J Ellenbogen's work at MITRE.org) and Ned Seeman at NYU is folding DNA into massively parallel computing devices. These bottom-up routes are to traditional scrapin' and shinin' lithography as Linux is to Windows 3.1. Maybe we don't need new ten-billion dollar fabs; maybe we just need some fresh ideas.
  • It seems that for every breakthrough that would obsolete an important technology, there is an equal and opposite chain of refinements in the old technology to keep it competitive.

    We've seen this many times now in the computer industry: a completely new technology appears that will utterly displace hard disks, for example, but in the five years it takes to go from the lab to the factory, hard disks become ten times as dense and drop to a tenth of the old price. In the end, the new technology is obsoleted by the new economics of the old technology even before it gets started.

    With better conductors (copper) and finer etching via something like this, CPU technology is likely to continue to follow the same lines for a long time still.

    Just something to consider when talk of optical processors or some new molecular switching technology hits the news again.
  • Forget the applications they are currently anticipating - the fundamental physics here is the exciting thing - very weird!

    BTW, where/what's the .cx domain?
  • by pen (7191)
    Is this the return of perfocards? :P

  • This is doubtless true. Even Einstein didn't create the atomic theory behind nuclear energy without help (the Manhattan Project).

    As a side note... if it's true that electrons on the surface of the metal are what cause this, wouldn't that mean that electromagnetic fields would have a major effect on it, especially high-frequency transmissions?
  • This is good for cheaper fabbing of ICs, but it will have to move fast to compete in the flat-screen market. There's already electronic ink making paper-thin displays; they may not do video, but it's a start. There's also a company making a phosphorus flat-screen display with an electron gun for every pixel, and they have OEM demos out.

    Unless this is cheaper and makes it to market in time, it will not survive. As for cheaper fabs, this will help the processor market a lot.
  • I agree it's intriguing; I especially like their description of what is actually happening. I was merely pointing out that if it is intended to enter the flat-display market, it's going to be rough.
  • Posted by 2B||!2B:

    Agreed. This is indicative of three main reasons why we had to put up with mediocre technology for ten years:

    1) The researcher might be greedy (possible; he's been collaborating in a managerial position with NEC since the '80s) or a show-off (common in researchers, especially when, like him, they're looking for a full professorship (which he has since obtained in France)).

    2) NEC's nondisclosure requirements are way too restrictive. This is almost guaranteed. He's been working with them for over ten years, and has a managerial position. My boss did an internship with them a few years ago, and his NDA all but prohibited him from ever touching a computer again. Like the rest, they're _very_ greedy (hiding desperately needed technology for _10_ years!?) and immoral in achieving their goals.

    3) Research universities set unrealistic requirements on their professors for both jaw-dropping publications and high-income research projects. I have seen this far too many times. All the administration of any research university ever thinks is "show me the money!". One of the greatest enemies of invention and application of new technology has long been college administrators. If they don't see serious corporate sponsorship coming out of a project, the project can get canned. Wonderful professors are frequently denied tenure because they're too busy actually teaching students (which, I had mistakenly assumed, is part of their job) or advancing science in things that matter (instead of worthless projects which are sexy to corporations who want a tax write-off for their large donations). I saw it happen to a friend of mine. He was a great teacher, and his research was in real-life applications involving medical imaging. But, no, it wasn't a quick buck, it was _useful_ research. And, no, he didn't have a ton of publications (though he, at a young age, was already one of the main references in graphics textbooks), because he was busier inventing and applying than writing. Since they wouldn't let him do useful research, he's off at a government lab figuring out better ways to blow up the planet instead of curing its inhabitants.

    Once something is done, schools are sometimes overly restrictive on its use in commercial products (or clueless on how to get it there), and many wonderful new technologies rot away on paper. Linux would surely be far better than it is now if universities would give us the technologies they have already invented and will probably never get another dime out of. For that matter, I bet AIDS and cancer would have been cured by now if research universities weren't so overly protective of their results.

    And in this particular instance, we've been putting up with some pretty pathetic LCD screens for a decade because of the petty concerns of some a**holes in universities and corporations.
  • I don't think so. It has much more to do with advertising/marketing than improvement in quality. How else can you explain the incredible popularity of the painfully incremental upgrades of Windows and x86 chips? Is Windows 98 1000x better technology than Windows 95? I don't think so!
  • When I read the article, I had similar thoughts -- and also one other: Extend the GPL to science?

    E.g., "you may use this discovery, and you may make money out of it, but any discoveries you make as a consequence of working on it must be made available under the same terms".

    Would that work? Could it be made legally binding?

    --
    Repton.

  • Nope, the theory was all his; the Manhattan Project was to put his theory to use and actually build a nuclear weapon.

  • Research scientists need funds to do their work, and the only way you get money is to produce results from your lab. If you spread the work around the entire population of scientists, which lab gets funded?

    By the way, where did you hear about your myth of OpenSource Science? Secrecy has ALWAYS been a part of scientific endeavours. While it's still open to dispute, it seems that both Newton and Leibniz were working in secret developing calculus simultaneously (both doing so in hopes of getting more funding for future work).

    I'd bet that the majority of scientists and mathematicians have secret projects that they've been turning over in their minds for most of their careers, hoping to finally crack them. Not only does this drive them throughout their careers, it's probably the best source of truly novel ideas I can think of.

  • But eventually it will be more profitable to jump to new technologies. Look at monitors: there have been dozens of new display technologies over the past 20-30 years aimed at replacing CRTs, and, as with CPUs, manufacturers keep refining the current technology to keep pace, but LCD displays have finally broken through onto desktops and they're starting to appear everywhere.

    Sure, computer technologies die hard, but they do die eventually. :)

    -Paul
  • Electronic ink isn't going to replace TFT for a long time, if ever. LEP (light-emitting polymers) will most likely be the LCD's replacement in displays once the technology becomes more refined; they're something like a tenth the price of an LCD for the same size display, AFAIK.
  • I don't really care much about its application, it's just cool. Of course it is good for business; brighter, cheaper LCDs would make me plenty happy, or an LCD that doesn't rob the life out of poor laptop batteries. I see people bitching about GPL science and the like, but I'm sorry, everything shouldn't be open source. Some people like to share their research with lots of people, and some like to work on it privately; it's their choice. Geez, damn fanatics. All fanatics must die.
  • I also enjoy solving problems by myself. Science has very little of that. It's mostly generating vast amounts of data in the most boring fashion possible, and then drawing a theory out of all the data. Then testing your theory over and over again in every possible way in the most boring fashion possible.
    Ever have to clean out a thousand test tubes by hand?

    Later
    Erik Z
  • > Research scientists need funds to do their work, and the only way you get money is to produce results from your lab. If you spread the work around the entire population of scientists, which lab gets funded?

    The lab with the reputation of getting the best results the quickest?

    Later
    ErikZ
