Technology

Nano-Plotters May Reduce Circuit Size 41

osm writes: "Using nanoplotter pens dipped into organic molecules, this device has drawn structures with lines 15 nanometers wide. This could be used to produce circuits several orders of magnitude smaller than what is currently possible. Full story is on latimes.com." "Understanding the fundamental processes of electron and ion transport and chemical reactions that occur within such films is vital to the development of new molecule-based chemical sensors, optical switches, electrocatalysts, nanofabrication technology, and other electronic and photonic devices," says the Web site of Dr. Chad Mirkin, head of the team involved in this research.
This discussion has been archived. No new comments can be posted.

  • you know this mirkin business has been posted here two or three times already. let's just say it's not new anymore.

    so let's ask, what are the direct applications? has anything been done so far which is electronically interesting? feel free to answer. i personally want to see a device made with this technology very soon.

    while the latimes.com article is plugging the economical viability, let's get a handle on how much a scanning probe microscope costs (i think they're using a Park Scientific setup (now Thermomicroscopes)). i've seen their machine and believe it's on the order of 100K. this is similar to what a used SEM and deposition equipment (metal evaporators, etc.) would cost. now look at the resolution. as far as i am aware, the minimum linewidth achieved has been on the order of 5 nm, while the consistent linewidths are something like 15-30 nm. this is comparable to consistent linewidths in good SEM lithography.

    while the technology is interesting, it's still a ways off from being useful and they are definitely making bank from the hype surrounding the technique. but we should still remember that other techniques have achieved comparable or far greater resolution such as STM electrodeposition and STM nanomanipulation (of ATOMS dammit!).

    to the person that said quantum effects will limit the applications of molecular scale electronics, please be a bit more specific. all of solid-state electronics is quantum in origin, yet some of the 'quantum effects' you allude to may include weak-localization, universal conductance fluctuations, conductance quantization, etc. are these an issue at room temperature? depends on the mean free path and the electron-phonon scattering length. thermal fluctuations tend to smear out quantum interference effects at higher temperatures and coulomb blockade effects (ie, as in single electron transistors) also suffer from similar smearing.

  • Finally! My nanomachine construction plans have been long stymied by the lack of a printer capable of outputting full-size blueprints. Only problem with this nanoplotter is, I can't seem to figure out how to change the pens.

    Plus I can't seem to find Windows drivers... :)
  • not exactly sure what you mean by 2.5d layout... but some people in my lab are working on a design that involves layering processors one on top of the other, and using optical interconnects (instead of wire) to pass information in the z-direction.

    anyhow, something I've not heard mentioned, but that is extremely important when dealing with wires this size is the propagation delay through the wires. under current technology, transistor speeds have already outstripped wire speeds (ie, how fast an electron can move along a wire). as wires decrease in diameter, they get slower (more resistance). definitely will be a problem if they're planning on using this technology to simply miniaturize existing hardware styles.

  • Is he a dot, or is he a speck?
    When he's underwater does he get wet?
    Or does the water get him instead?
    Nobody knows, Particle Man...
  • Modern photolithography techniques WILL work for 3d ICs if we can find out a way to grow silicon crystals on top of amorphous surfaces. The top layer of ICs is pretty much a silicon oxide, which is not a crystal structure. If we can grow a silicon crystal on top of this oxide, we can then start a new IC right on top of the old layers.
  • by Paul Johnson ( 33553 ) on Tuesday June 13, 2000 @02:52AM (#1006707) Homepage
    The next big issue for computer circuitry is going to be 3d layout. At present we have 2.5d: you can layer some circuitry on top of other circuitry, but no more.

    Full 3d is going to require nano-fabrication techniques: no form of lithography will cut the mustard. While painting of this kind is not the full solution, it's definitely a step in the right direction.

    There are of course many other challenges in 3d circuitry, cooling not least.

    ISTR some work done by IBM with tiny balls that had a few hundred devices on the surface of each and were packed into a cubic array. Does anyone know where that went?

    Paul.

  • So I doubt that quantum mechanics will really form that much of a barrier to the size of circuitry. It'll require a new methodology and new techniques to be sure, but it's hardly like it'll be impossible to make some analog of electronic circuits at very small scales.

    Unfortunately what happens at about 0.05 microns is that wires pressed that close together form a structure known as a Josephson junction (a magnetic flux-sensitive quantum barrier). Under certain conditions, electrons can arbitrarily jump from one wire to another, depending on the exact number of flux quanta mediating the junction at the time. Unfortunately in a standard work environment, controlling magnetic fields to the level of quanta is next to impossible, thus setting an upper limit on the coherence of classical circuitry at the sub-0.05 micron limit. This was my initial assertion and it still stands, despite insubstantial denials.

    That is, of course, not to say however that there will be no further progress along the road of computing. Quantum effects may one day prove to be the saviour of computing, and time will of course tell whether we ultimately have a quantum computer sitting on our desks in ten years time.

    However, you all know as well as I do that Moore's law is coming to an end, and to attempt to deny that without sound scientific reasoning seems to me like a bad case of denial.

  • Unless we do get some new techniques then we will eventually hit the quantum barrier, but probably still later than expected.

    Sir, in many industries we are already there. I work with a firm that spends a great deal of time with rotary dynamics. Currently we are nowhere near the quantum barrier; however, the leaps made in using the 1860s rotaryfan algorithms in applied dynamics is producing tremendous results. The industry standard components you referenced are still the baseline, however when implemented with non-standard power and AI systems to control them (sort of a neural network), the performance of the rotary response is greatly achieved. We were close to reaching the hof thresholds, however the glass ceiling we were shooting for was re-assessed and it looks like we have a bit further to go. In any event the SE architecture does appear to have great advantages over conventional methods.

    Regards

  • No, I'm not denying that the end of Moore's law is inevitable given current production techniques and circuit technology, rather I'm saying that there will be new developments which will allow us to continue using the same basic model of circuitry without going to a radically different architecture, such as quantum computing.

    See this article [sandia.gov] for an example of a component which rather than being hampered by quantum effects, instead relies on them to work. There are other similar efforts underway, I think IBM are working on similar projects, and I think that by the time Moore's law ends we'll have the basis for a new kind of electronic circuit based on the same general principles, but different component architectures.


    ---
    Jon E. Erikson
  • Another current area of research interest is using carbon nanotubes as circuit elements. These little wonders can behave as conductors, semiconductors or insulators. Combining these different types, one can construct transistors, diodes, etc., making (almost) all-carbon circuits an eventual possibility. Sorry, no URL right now.
  • This [ibm.com] is the article I think you are talking about. I think you are talking about the electronic states in magnetic nanostructures, which would have been in that research article. If not, sorry for the mistake. It's an interesting article nonetheless.

  • If I remember right, resistance is inversely proportional to the cross-sectional area of the wire (i.e. how fat the wire is), but also directly proportional to length; something like R = K*L/A, where K is some constant. Assuming we're only changing dimensions in the x and y directions, L and A will be reduced by the same factor, i.e. smaller by a factor of s implies R = K*(s*L)/(s*A). I'm assuming the z axis is not affected by this technology because that part of manufacturing is controlled by chemical reaction rates, instead of photolithography. If z were reduced too, A would be scaled by a factor of s^2 instead of just s. So, I could be wrong, but it seems to me that if the x and y dimensions were reduced by the same factor, the effective resistance would remain the same.
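The scaling argument in the comment above can be checked numerically. This is a minimal sketch; the copper-like resistivity and wire dimensions are my own illustrative assumptions, not numbers from the article:

```python
# R = rho * L / A. If the layout shrinks only in x and y by a factor s,
# the length L scales by s and the cross-section A = w * h also scales
# by s (the height h is set by deposition chemistry, not lithography),
# so R is unchanged -- exactly as argued above.

def resistance(rho, length, area):
    """Classical wire resistance R = rho * L / A."""
    return rho * length / area

rho = 1.7e-8                  # ohm*m, roughly bulk copper (illustrative)
L, w, h = 1e-3, 1e-6, 1e-6    # 1 mm wire, 1 um x 1 um cross-section

r_before = resistance(rho, L, w * h)

s = 0.5                       # shrink x and y by half; z (height h) untouched
r_after = resistance(rho, s * L, (s * w) * h)

assert abs(r_before - r_after) < 1e-9   # resistance is unchanged
```

If z shrank as well, A would pick up a factor of s^2 against L's single factor of s, and R would grow by 1/s.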
  • The use of MALDI-TOF has a fundamental impact on the analytical process, because it detects biomolecules directly and therefore labels and separation steps are not needed. MALDI-TOF MS directly measures molecules during time-of-flight according to the difference in molecular weights, combining separation, detection and characterization in one single step. Characterization of a molecule is obtained because the weight of a molecule is a physical standard and allows unambiguous identification. This combined process of separation, detection and characterization takes place in less than milliseconds because the molecular ions are flying through the high vacuum without resistance, and, with GPL'd bioinformatics software, the signals are immediately recorded as electronic signals in the computer, ready for further data mining and archiving.
  • The nanoplotter is designed to plot holograms, gratings, masks, etc. with extremely high precision and resolution combined with high speed. The plotter writes with an HeNe, a krypton or an N-P-now19 laser, modulated with Bragg cells, on a coated glass master. The system has high plotting flexibility. The plotter consists of a spindle and a linear stage. Both axes use air bearings, direct drive motors and high-resolution encoders. Plotting is done in a polar inchfan coordinate system (data are read in common formats and remapped to this coordinate system). A focus detector keeps the beam in focus on the glass master. The plot size is only limited by the glass master, which can have a diameter of up to 160 mm. The resolution is dependent on the size of the hologram; small holograms can have extremely high resolution. Mechanical/electronic resolution of 100 nm is possible. The optical resolution is better than 600 nm. 256 levels of grey-scale can be achieved.
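The remapping step mentioned above (Cartesian plot data expressed in the spindle-plus-stage machine's polar system) amounts to a standard coordinate conversion. A minimal sketch, with function names and units that are my own assumptions rather than the plotter's actual data format:

```python
import math

def cartesian_to_polar(x, y):
    """Map a point (x, y) in mm to (radius_mm, angle_rad):
    radius drives the linear stage, angle the spindle."""
    return math.hypot(x, y), math.atan2(y, x)

r, theta = cartesian_to_polar(30.0, 40.0)
print(r, math.degrees(theta))   # 50.0 mm at about 53.13 degrees

# a 160 mm glass master limits the usable radius to 80 mm
assert r <= 80.0
```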
  • Are those like optical switches.. or do they have less 'cal' and more 'opti'?
  • Just fscking great.

    I showed the article to our CAD monkey. He blinked twice, stood up, walked to the President's office and quit.

    Thanks open-source man. Now we gotta find a new CAD clown because this one just went home muttering "Gonna go flip burgers at McDonald's..."

  • by Anonymous Coward
    This looks like Needle printers all over again.. just nanosized this time.

    Talk about reading the fine print.. :)

  • by ghutchis ( 7810 ) on Tuesday June 13, 2000 @07:04AM (#1006719) Homepage
    Sigh. Another molecular electronics thread...

    OK, so several other people have mentioned the main point--you can't think of these as continuing normal electronics. It's a whole different world because you're not using bulk properties anymore. Current silicon transistors would break down around 5 atoms or so (there are a number of papers pointing towards this barrier and I think a few have been posted to /.).

    So you're now thinking about charge carried through a molecule. But here's the problem with trying to make circuits out of these. How do you connect your wires?

    It's the interconnects that are the real problem.

    People like Dekker have shown that you can get conduction across carbon nanotubes or DNA or other single molecules. But the trick is getting the "wires" to stick together so you can actually do something useful.

    Yes, Mirkin's nanoplotter research is interesting. But you can't use it to lay out circuits until you get good molecular wires and good interconnects.

    My gut feeling is that self-assembly, perhaps in combination with something like a nanoplotter, is the best way to do an interconnect. But hey, I'm only a grad student... What do I know. >:-)

    -Geoff
  • Are you referring to IBM's proposals for "vertical transistors?" I think they're going into the prototype phase--I've seen some micrographs of them and they look pretty good. These don't require nano-design yet, but it's a step away from current chip design. It's much closer to true 3D design.

    The idea is similar to real estate crunches in big cities, actually.

    Why do people build big skyscrapers on Wall Street? The space is so valuable that they build up (and down) instead of out.

    Well, it's the same idea here. At some point, you just can't get more transistors on the chip, so you make the transistors go up-and-down instead of side-to-side.

    Voila! Better density through science. :-)

    I would guess you'll see vertical transistors in memory chips in a while.

    -Geoff
  • This technology will probably have limited use in producing circuits that are ever finer, as this takes you well into the realm of unpredictable quantum effects, where circuits can no longer be guaranteed to behave in a predictable way.

    In fact, quantum effects start to come into play at around 0.05 microns, a resolution it will be possible to achieve using Extreme Ultra Violet and/or X-ray lithography, ultimately rendering any technological attempt to produce smaller circuitry quite pointless.


  • ...ultimately rendering any technological attempt to produce smaller circuitry quite pointless.

    Ah, yes, of course. And we will never get any useful amount of energy from atoms, and space travel is utter bunk, and there's a world market for maybe five computers. And an operating system written by unpaid amateurs could never compete with MS-Windows.

    There's a jargon file entry [tuxedo.org] that mentions "a paper from the late 1970s that computed a purported ultimate limit on areal density for ICs that was in fact less than the routine densities of 5 years later."

    ------

  • Could this sort of technology be used, rather than for electronic circuits, for creating nano-machines?

    I seem to remember that these can be created by layering materials so would anything stop this technique from being used?

  • by Jon Erikson ( 198204 ) on Tuesday June 13, 2000 @02:03AM (#1006724)

    This technology will probably have limited use in producing circuits that are ever finer, as this takes you well into the realm of unpredictable quantum effects, where circuits can no longer be guaranteed to behave in a predictable way.

    I'm going to have to disagree with that last statement. Sure, at the scales we're talking about here quantum effects come into play, but they're hardly unpredictable. Unless we're talking about individual quantum processes, the outcome of which is indeed probabilistic, we can statistically predict what will happen with a large number of quantum processes, which is what will be taking place at this level.

    So I doubt that quantum mechanics will really form that much of a barrier to the size of circuitry. It'll require a new methodology and new techniques to be sure, but it's hardly like it'll be impossible to make some analog of electronic circuits at very small scales.


    ---
    Jon E. Erikson
  • Particle Man Particle Man Doing the things a particle can...
  • A significant problem with smaller circuit sizes is that voltages have to be reduced, the smaller things get. This is to avoid temperature build-up and also to reduce the potential for electrons to drift out of their tracks at corners (or indeed move tracks, similarly to the way rivers move and change when they bend).

    As voltages are reduced, the signal to noise ratio is decreased and it becomes more difficult to distinguish between them - this is then compounded by quantum effects that further reduce signal to noise ratios by (amongst other possibilities) reducing a signal (by tunnelling etc) or even boosting a 'zero' signal so high it gets picked up as a '1'.

    This can be avoided by clever circuit design but it is a fundamental limiting factor, although hitting the eventual limit will be complicated by such things as voltage, heat dissipation (too much heat will affect smaller tracks more than larger ones) and feature size.

    hohom

    Troc
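The noise-margin squeeze described in the comment above can be made concrete with Johnson-Nyquist thermal noise, V_rms = sqrt(4 k T R B). The resistance, temperature and bandwidth below are illustrative assumptions of mine, not figures from the thread:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K

def thermal_noise_rms(resistance_ohm, temp_k, bandwidth_hz):
    """RMS Johnson-Nyquist noise voltage: sqrt(4 k T R B)."""
    return math.sqrt(4 * k_B * temp_k * resistance_ohm * bandwidth_hz)

# a 1 kOhm interconnect at room temperature over a 1 GHz bandwidth
v_noise = thermal_noise_rms(resistance_ohm=1e3, temp_k=300, bandwidth_hz=1e9)

# shrinking supply rails shrink the signal while the noise floor stays put
for vdd in (5.0, 1.0, 0.1):
    print(f"Vdd={vdd:4.1f} V  signal/noise = {vdd / v_noise:.1e}")
```

With these numbers the noise floor is on the order of 0.1 mV, so each drop in supply voltage cuts directly into the margin available to distinguish a '0' from a '1'.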
  • by Penrif ( 33473 )
    Holy cow. I can't even start to imagine what kind of precautions we're going to need to use to prevent electro-static discharge on these things. That small of a track is going to be real easy to fry.
  • As the chips get smaller and smaller, and the threat of quantum effects appear and all, it seems like an obvious question of priority (which this article demonstrates) - Make something new and make it work, or make something work and then move on?
    Notice the "several orders of magnitude" in there - it looks like the push is to develop the technology, then worry about its feasibility.
    But don't pay too much attention to me - I just want the matchbook PC.
  • What's he like? It's not important. Particle Man.

    ;It's a geek singalong! Join in!

    --
  • Quantum effects and mechanics are the *ONLY* reason we figured out how to make a transistor in the first place.

  • Is this where they say that zero-G or orbital production would be useful? Or is that just a scam to go on a fun space ride?
  • Sledgehammer to crack a nut, surely? You hardly need a nanoplotter to print a diffraction grating.
  • just to point out that the legal complications which surrounded previous versions of the same technology (N-P-17 and the N-P-13 Professional) are now cleared up. These versions had fallen foul of New York State and other local regulations; however, these matters have been settled (my firm's regulatory practice was involved), and N-P NoW19 is cleared by all appropriate agencies (I'm not sure about Utah; check your own lawyer, as ever). Particularly, the TPM version is free of all possible complications.

    I realise that not one reader in a thousand will care about or understand this thread, but what the hell, /. has thousands of readers.
  • True, but that's all assuming that the basic technology behind circuitry remains pretty much the same in the future, whereas I'm more of the belief that new techniques will have to be developed to replace the standard components - resistors, capacitors, diodes etc. Unless we do get some new techniques then we will eventually hit the quantum barrier, but probably still later than expected.

    Indeed, the quantum effects that will eventually kill current circuit design are the most likely candidates for producing new ones :)


    ---
    Jon E. Erikson
  • Hmm, I dunno, the N-P-17 did go pretty well, but thanks to the legal angle you wouldn't want it known that you'd got one and was breaking it in. The latest version doesn't need quite the same level of training as the N-P-17, the heuristics must be a lot better.


    ---
    Jon E. Erikson
  • Mmm, quantum mechanics isn't guaranteed to keep us from continuing to shrink circuitry. For one thing, quantum mechanics is probabilistic. It's unpredictable in that you can't know what, say, one particle is exactly going to do, but you can know the statistics well enough to know what thousands of particles will do.

    The problem is that current circuitry is not designed with quantum mechanical effects in mind. You need something like the quantum mechanical transistor that a lot of people are working on, including a research team at Sandia National Labs [sandia.gov] -- devices which are designed with QM effects in mind, and are optimized to take advantage of those effects.

    Sargent

  • The big problem isn't really resistance (although that is certainly a part of it) -- it's the RC constant and the drive of the transistors connected to a wire. Making some gross assumptions we have:
    R proportional to L/A or since Z doesn't change, L/W
    C proportional to L*W
    RC proportional to L^2

    transistor drive proportional to feature size squared.

    So, when we move to a new process with a smaller feature size (let's say 1/2 of the current), the wires reduce in length by 1/2 and the RC reduces to 1/4. This is good since the drive of our transistors has also dropped to 1/4. But of course, the whole reason for moving to a smaller process is to get more transistors on an economical die, so the die size remains fairly constant -- we just cram four times as many transistors into it! And the wire length remains the same so our poor 1/4-drive transistors have to charge up a wire with an unchanged RC.

    Notice that all of this ignores the effect of having to reduce voltage to avoid hot-electron effects -- these are quite important.
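The local-versus-global-wire distinction in the scaling argument above can be sketched numerically. Unit constants are folded away since only the proportionalities matter; the baseline numbers are arbitrary:

```python
# R ~ L/W, C ~ L*W, so a wire's RC ~ L^2; transistor drive ~ s^2.
# A *local* wire shrinks with the process; a *global* wire spans a
# die whose size stays fixed.

def rc_delay(length, width):
    r = length / width        # resistance rises as the wire narrows
    c = length * width        # capacitance falls with narrower wire
    return r * c              # proportional to length**2

s = 0.5                       # feature-size shrink
drive = s ** 2                # transistor drive weakens with s^2

local = rc_delay(s * 1.0, s * 1.0) / drive    # wire shrinks with process
global_ = rc_delay(1.0, s * 1.0) / drive      # wire spans the same die

print(local, global_)   # local delay unchanged (1.0); global delay 4x worse
```

The global wire's RC is unchanged (R doubles, C halves), but the quarter-strength transistor now has to charge it, giving the 4x penalty the comment describes.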
  • Someone about a year ago claimed they had a technique for shaving a silicon chip down to remove the substrate, and bonding the remaining IC to the top of another chip. The idea was to make modular chips.

    Since the time it takes to do nanolathing is a function of the number of features, instead of the number of layers as with photolithography, it may be useful to have a machine that's responsible for producing only part of a particular chip, instead of the whole thing, to keep complexity down.

    One configuration of lithography machine could then produce cache memory for the whole product line, or FPUs. Only the control layers would vary, in that they would have to handle N-way parallelism, where N is controlled by the type of processor (consumer, workstation, server).

    I can't seem to come up with the right search engine incantation to locate the announcement, however. Maybe someone else remembers what I'm talking about?


    -
  • Disclaimer: because I'm employed by N-P-13's parent company, I'm not going to identify them directly.

    I believe you are referencing Northern District of California, who ruled on the N-P-13 issues.

    N-P-13 is an offering from a software development and publishing company. Among other things, the company makes educational versions of their software, which are available to students and educators at a significant discount. Defendant SIDIF buys and sells computer hardware and software on the open market. N-P-13's parent company alleges that SIDIF improperly acquired an educational version of N-P-13 software, which it then adulterated and sold as full retail versions to non-educational users. In its complaint, N-P-13 alleged that the agreement was a licensing agreement and not an actual sale, that SIDIF infringed N-P-13 copyright, and that SIDIF infringed N-P-13 trademark.

    The court found that the Off Campus Reseller Agreement, which governs the educational seller's relationship with N-P-13's parent company, was a licensing agreement and not an actual sale. Because the first sale doctrine, implemented by the defendant, is triggered only by an actual sale, and because a copyright owner does not forfeit his right to distribution by entering into a licensing agreement, this factor weighs in favor of the plaintiff. The OCRA is a licensing agreement. Thus, contrary to SIDIF's assertions, the OCRA does not represent a first sale between the seller and N-P-13. SIDIF's failure to trace its N-P-13 products to a sale renders the first sale doctrine inapplicable and subjects SIDIF to potential liability under copyright law.

    The court also found that SIDIF committed copyright infringement as a matter of law under Section 501(a). By obtaining the software from a party to an N-P-13 licensing agreement, SIDIF was bound by any restrictions imposed by that agreement. Thus, SIDIF committed copyright infringement.

    Lastly, the court found that SIDIF did not infringe N-P-13's trademark. Although N-P-13 attempts to parallel its case to Shell Oil, the Court found Shell Oil distinguishable. The court found that the mere distribution by SIDIF of admittedly unadulterated software is insufficient to establish trademark infringement.

    In N-P-13's parent company, v. N-P-13 Inc. the Northern District of California held that the agreement under which software was distributed was a licensing agreement, not subject to the Copyright Act provision that copyright did not extend to resale of copyrighted items following their initial sale. The court also found that the license agreement applied to the distributor, even though it was not signatory. Last, the court held that the distributor committed copyright infringement by violating the licensing agreement.

  • It depends on how you take it. If you expect the computing power of a single processor to double every 18 months, then it will end. But, with the process of putting more processor cores on each die you can still maintain this fantastic exponential growth. Look at the POWER processor. Manufactured on industry standard processes, it has significant advances not utilized by the other processors being made.

    Instead of making things smaller, we need to look at parallelism: making more work done on each clock cycle. The Athlon really shows what can be done with multiple instructions in flight at the same time. My K6-2 tops out at 2 per cycle, while the Athlon can execute 4 concurrently.

    You don't need to make a single monolithic processor; especially when multiple processors on the same die can do so much more work.

    One of the major setbacks of the processors of today is out-of-order execution. If this work were done explicitly by the compiler, great strides would be made for faster, simpler processors.

    Circuits don't have to be faster to make a computer faster; it's all in the architecture and efficiency of the work that it is doing.
  • The revolutionary aspects of nanotechnology should force the law to change prospectively or at least provide a mechanism to properly address nanotechnology. All too often, the law is one of the last societal institutions to adapt to technological advances. Improvements in communication and societal consciousness particularly with the revolutionary advances in technology that the world is experiencing should make prospective change feasible.

    Regulation and the ethical considerations in the computer law and biotechnology demonstrate the problems that can occur with revolutionary new technology which contain new concepts. The window of this opportunity may soon be too late as every day passes. New companies are being incorporated that focus exclusively on nanotechnology and revolutionary advances in this area are likely to occur at any given moment. These advances, once they begin occurring, will likely accelerate much faster than past technologies and lead into what is known as the "Nanotechnology Revolution."

    Although the prophetic vision of Drexler's coming era of nanotechnology was published over fifteen years ago, although the Nobel Prize in chemistry was awarded to Smalley a few years ago for his nanotechnological achievement, although brilliant minds such as Ralph Merkle at Xerox PARC and now Zyvex Corp. have long been working hard at designing atomic manipulators, although leading scientists in the field, such as Robert Freitas, have authored volumes of texts on the applications of nanotechnology to medicine, although all these achievements and many others have been occurring for over a decade, there has only been one legal article devoted to the subject and it was published over five years ago.1

    As a matter of fact, the term "nanotechnology" only appears in thirteen articles in all the published legal scholarly materials and law reviews in the United States. Thus, despite the improved foresight and opportunity of prospective change and all the legal discussion of biotechnology and cloning, discussion of nanotechnology in the legal arena is almost nonexistent at a point in time that is dangerously close to revolutionary nanotechnological developments. Although the legal field and scientific fields are akin to night and day, integration and understanding between the two fields are crucial and discussion of the legal implications of nanotechnology is necessary immediately for properly developed and educated regulation. One particular area of nanotechnology that necessitates prospective regulation is a particularly interesting class of nanotechnology termed "replicating nanotechnology."

    This is perhaps one of the more important classes of technology in all of man's technological development. Although nanotechnology will change all our lives as we know them, it does not necessarily follow that changes in the law are that revolutionary. It might require only slight modification, and perhaps the law is already present and we simply need to readjust it accordingly. As Amelia Boss, a law professor at Temple University School of Law and a member of the Permanent Editorial Board of the Uniform Commercial Code ("UCC"),2 has stated in our discussions about the legal implications of nanotechnology: "It is always easiest to say 'this is new technology; we need new law.' The harder, and more interesting, challenge is to demonstrate how the new technology simply repackages old problems, and how concepts that have developed over the centuries really do work when applied to new situations." Professor Boss's assertion could be true, at least in theory; however, nanotechnology will likely necessitate a revolution in legal adaptation.

    Nanotechnology law will be unlike biotechnology law, computer law or any other type of revolutionary technology. One will not be able to simply take into account all existing law and analyze where in the current body of existing law changes or additions will occur with nanotechnology. Initially, this will be possible and inevitable, but nanotechnology will replace all manufacturing processes for all present goods, not only producing them at a higher rate, but producing goods and technology that will be able to respond to an almost infinite number of properties or tasks that are logistically possible. When it begins to be applied to every aspect of our lives, it will affect every component of our lives and law. Computers are a range of products. The automobile is a range of products. The assembly line was applicable to a range of products. Fire is even applicable to a range of uses. However, nanotechnology will be applicable to almost all processes, even biological. The law will not be able to respond in a similar manner as it has in the past. Before analyzing replicating nanotechnology, it is important to understand the replicating aspect in isolation before evaluating the nanotechnological aspect, and to distance the distraction of the implications of this newly emerging nanotechnology, which, although it is becoming more and more advanced, the lay person still knows very little about. The replicating aspect can first be applied in discussing the implications for replicating micro- or macrotechnology. This case study approach is irrelevant to analyzing replicating nanotechnology, because they are likely to be developed in conjunction with one another; however, it provides an excellent basis for a mental exercise in understanding the sometimes unfathomable impact of not only nanotechnology, but self-replicating nanotechnology.
