50 Year Old Computer Still Going

The Angry Mick writes "Geek.com is running a blurb on a 50 year old CSIRAC computer that is apparently still functional, if lurking in an Australian museum. Sporting a whopping 2K of RAM and screaming along at a blistering 300 kHz(!) it proves the adage that they really don't make 'em like they used to . . ." Yes, because if they did, they'd be really, really slow.
  • I wonder... (Score:2, Insightful)

    by ciupman ( 413849 )
    .. how do they cool that thing down?
  • Wow... (Score:3, Funny)

    by Agent Green ( 231202 ) on Thursday December 12, 2002 @07:05AM (#4869248)
    ...30 more years and my Apple //e will have been running for 50 years! Woohoo!
  • I find the idea of a massive computer lurking rather funny. Of course, it could be the 4 Guinnesses I just polished off. Oh well, time for bed. I hope I don't have dreams of ENIAC or some other thing now!
  • Imagine... (Score:5, Funny)

    by Russellkhan ( 570824 ) on Thursday December 12, 2002 @07:07AM (#4869252)
    A Beowulf cluster of- oh never mind, where would you fit it?
  • built to last (Score:5, Interesting)

    by nath_o_brien ( 608347 ) <nath@nathans-domain-name.org.uk> on Thursday December 12, 2002 @07:08AM (#4869266) Homepage
    and linking that to yesterday's discussion about the lack of quality these days [slashdot.org], I bet we won't have any/many of today's computers around in another 50 years' time... or 50 days for some of them...
    • It's not about quality. This baby had to have tons of valves replaced constantly just to keep it running. And that was when it was NEW!
    • by Goonie ( 8651 ) <robert.merkel@be ... g ['ra.' in gap]> on Thursday December 12, 2002 @07:48AM (#4869440) Homepage
      I wouldn't get too excited about CSIRAC's reliability. The memory in particular had a pretty high error rate, so they often ran programs twice and compared the results to make sure they got the same answer...

      On one occasion, they gave a demo to an organisation called the Institute of Radio Engineers (IRE), but apparently a memory error occurred and the thing printed "CSIRAC welcomes the members of the IRA" :)

    • One computer from 50 years ago...wow. So, that implies a failure rate of what over the course of 50 years? I know there are many more than just that one computer left, but really, how many computers of that vintage are still in operating condition? Not many, I'd guess.

      So, realistically, we can expect "not many" of the current computers to survive 50 years, especially without the regular maintenance I'm sure that old beast receives.

      It's just like humans, really. The life expectancy goes up as a person gets older. Sort of a "made it this far, will probably make it farther" thing.
    • Let's see... [slashdot math] That would require replacing all the IDE drives over 300 times! Also, it's too bad there's only one, because otherwise we could, y'know, build a... Oh, never mind.
  • Running eh? (Score:5, Insightful)

    by Fizzl ( 209397 ) <<ten.lzzif> <ta> <lzzif>> on Thursday December 12, 2002 @07:09AM (#4869269) Homepage Journal

    By reading the horde of nested articles, I got the impression that the machine hasn't run in decades, and probably would not if powered.



    Correct me if I'm wrong. But please quote a piece that says it is actually running now.

    • Re:Running eh? (Score:2, Informative)

      by Kajakske ( 59577 )
      Original file:
      The machine was the fourth computer to be built anywhere in the world, ran at 0.001MHz, and had a massive 2000 bytes of memory and a behemothic 2500 bytes of storage.
      And it's still running, now safe in the Melbourne Museum, in Australia.


    • Re:Running eh? (Score:5, Informative)

      by Epsillon ( 608775 ) on Thursday December 12, 2002 @07:43AM (#4869419) Journal

      Someone's comment, your quote. It's actually more likely that if they were to power it up, they would apply voltage gradually to allow the electrolytic capacitors to re-form and the getter rings/compounds [demon.co.uk] in the thermionic devices to restore vacuum.

      It's not unusual for thermionic equipment to survive long periods of time without use. There is still radio equipment from this era running strongly in museums and private collections and, dare I say, in everyday use. The odd capacitor may fail short once in a while, resistors may fail _high_ (they gradually increase resistance with time - a known phenomenon) or valves/tubes may lose a heater or go "soft", but I think it's stretching the imagination somewhat to expect it to burst into flames.

      Incidentally, designers from this era often made their chassis live (high potential with respect to ground) so the only thing I'd expect to catch fire would be the young PFY geek leaning on it to get a better view of the thermionics powering up and starting to glow...;o)

      • I've got a 50 year old Hallicrafters (sp?) short wave radio; my parents got it for me from an antique shop when I was a kid. I've never done any maintenance on it, but I fire it up every couple of years, and it still works perfectly.

        This thing is just about the best built and coolest looking piece of equipment I've ever seen. The black steel cabinet is just about indestructible. The glowing green dials and big metal toggle switches look great. The all-tube electronics turns radio static, Morse code and beat frequencies into a wonderful eerie ambience that's always changing as you twiddle the knobs.

  • by MichaelCrawford ( 610140 ) on Thursday December 12, 2002 @07:11AM (#4869278) Homepage Journal
    That computer has more RAM than the embedded device I've spent the last couple of months programming.

    Although I have the advantage of having a whopping 64k of ROM. I only have to use the RAM for my data. I would expect that computer also has to store the program binary in the 2k. Overlays, anyone?

    Lately I've been finding it worth my time to spend a few hours recoding some functions in order to shave just a few bytes off their stack usage.

    Kids these days, assuming everyone's got 128 megabytes for their application. They just don't code 'em like they used to.

    • I KNOW, in the old days, we had to code in the SNOW! going uphill. Both ways!
    • by ottffssent ( 18387 ) on Thursday December 12, 2002 @07:16AM (#4869303)
      *laugh*

      I don't even notice unless an app is using over 100M (technically, 100,000KB, but who's counting?)

      But it sure would be nice if Windows would notice I have gobs of RAM lying around and start using it for something productive like caching the disk subsystem, rather than the other way around. There is no excuse for a system with >512M of free RAM paging to disk! What ass-backwards VM got stuck into Windows, anyway?
    • 2K of RAM would be very useful (I got 640 bytes). Although direct access ROM is very nice (better than talking through a serial port anyway), having extra RAM would be lovely, letting you compress stuff better, and mix data by category instead of by whether or not it changes.
  • Impressive (Score:3, Insightful)

    by FrostedWheat ( 172733 ) on Thursday December 12, 2002 @07:13AM (#4869287)
    My first computer, the C64, runs at a massive 1MHz, only about 3 times faster than this machine.

    Commodore released the 64 in 1982, which puts it at 20 years of age. That's 30 years between these two machines. When did Moore make that law again? :)

    Yikes, imagine what the computer world will be like in 30 years time! Assuming MS haven't screwed it up for everyone.
    • by Tune ( 17738 ) on Thursday December 12, 2002 @07:36AM (#4869387)
      Moore's Law includes price. Did you take into account that you might have paid less when purchasing your 1982 C64 than was spent on CSIRAC, 20 years earlier?

      Btw, the C64 features 64kB, which is 32 times 2kB, so memory size doubled at least five times in 20 years; that is, it doubled every four years. (A quick check of that arithmetic appears in the sketch below.)

      --
      In theory there is no difference between practice and theory. But in practice there is -- Jan L.A. van Snepscheut
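
      A minimal Python check of the doubling arithmetic above (a sketch; only the 2kB and 64kB figures from the comment are assumed):

      import math

      doublings = math.log2(64 / 2)                 # 2 kB -> 64 kB
      print(doublings, "doublings")                 # 5.0
      print(20 / doublings, "years per doubling")   # 4.0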
      • by dmaxwell ( 43234 ) on Thursday December 12, 2002 @09:51AM (#4870176)
        The C64 was a CONSUMER item. When the CSIRAC was built there was no such thing as a computer for consumers. It would be more appropriate to compare the CSIRAC to the so-called supercomputers available in 1982, machines like the Cray X-MP and Cyber 205. The costs to own and operate them are comparable to what it took to operate the CSIRAC in its day.

        The UK's weather bureau gives specs on the Cyber 205 they were using in '82:

        http://www.met-office.gov.uk/research/nwp/numerical/computers/history.html

        CDC Cyber 205

        200 MHz clock
        1 MegaWord of memory
        The Cyber had a 64-bit word size, so that amounted to 8 MB of RAM. So clock speed has increased over 600 times and memory over 4000 times in that time frame. This is just confining myself to the 205; I didn't look up the specs on other large machines like the Crays that were available then.

        Computers as something just anyone could play with were pretty much nonexistent prior to '77 (true, you could build something ENIAC-like anytime in the seventies if you were REALLY good with electronics). It's more instructive to see what the kind of money they had to spend on the CSIRAC will get you as time moves forward. Power comparable to the C64 was available in the early sixties for that kind of money.
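
        A quick Python sketch of those ratios (assuming only the figures quoted above: 300 kHz and 2000 bytes for CSIRAC, 200 MHz and 1 MegaWord of 64-bit words for the Cyber 205):

        csirac_clock_hz = 300e3            # 300 kHz, as quoted by Geek.com
        csirac_ram_bytes = 2_000           # "2000 bytes of memory"

        cyber_clock_hz = 200e6             # 200 MHz clock
        cyber_ram_bytes = 1_000_000 * 8    # 1 MWord of 64-bit words = 8 MB

        print(f"clock: {cyber_clock_hz / csirac_clock_hz:.0f}x")     # ~667x, "over 600 times"
        print(f"memory: {cyber_ram_bytes / csirac_ram_bytes:.0f}x")  # 4000x, "over 4000 times"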
  • by jki ( 624756 ) on Thursday December 12, 2002 @07:14AM (#4869290) Homepage
    From here [abc.net.au]: CSIRAC's first programmer, Geoff Hill, came from a musical family and he programmed the computer to play popular musical melodies which could be heard through a loudspeaker originally installed for a quite different purpose - to indicate with audible "beeps" when particular points of interest in the program had been reached.

    Not bad for a living dinosaur. Listen to it [abc.net.au] yourself :)

    • For those of you who didn't read the article pointed to by the parent poster before listening: that is NOT music being played by the CSIRAC itself, but rather a software recreation of the original hardware and a modern recreation of the original speaker.
      It basically sounds like my old Spectrum, only a bit worse :)
      I wonder what all the background noise on it is, though; it certainly sounded like they had a massive computer in the room while playing it.
  • lies, all lies... (Score:5, Informative)

    by Anonymous Coward on Thursday December 12, 2002 @07:15AM (#4869295)
    It's not running anymore, as stated here: [abc.net.au]

    Sadly, it's not an option to make CSIRAC operational again today. Time has taken a toll on this fragile dinosaur.

    So what exactly would happen if anyone tried to relive the magic by switching it on?

    "A lot of its components would not stand having voltages applied to them again," says Thorne. "I think it would probably catch fire."

  • by Viol8 ( 599362 ) on Thursday December 12, 2002 @07:15AM (#4869297) Homepage
    It does annoy me that people, even though it's in good humour, snigger at these old machines with their "paltry" 2K memory and slow speed. Yeah, sure, they're not exactly a Cray. But look at what was done with this one. Skyscraper design, cloud droplet simulation, antenna design! Let's see even the best programmers, used to point-and-drool GUI interfaces and hand-holding wizards, try and do that in 2K now using little more than paper tape! The people who designed, built and programmed these machines REALLY knew what they were doing and probably forgot more about efficient programming and code compression than today's "top" coders ever knew in the first place.
    • by Krokus ( 88121 ) on Thursday December 12, 2002 @07:29AM (#4869357) Homepage
      Let's see even the best programmers, used to point-and-drool GUI interfaces and hand-holding wizards, try and do that in 2K now using little more than paper tape! The people who designed, built and programmed these machines REALLY knew what they were doing

      I'm pretty sure they would not have turned up their noses at the idea of being given a machine that had a GUI and piles of RAM and storage. Oh, to be able to focus on the problem at hand and not have to be distracted by the limitations of a 64-byte stack!

      To belittle the programmers of today because they have not suffered the restrictions of yesteryear is a bit silly. Even today, there are embedded systems programmers who still deal with such restrictions. Do we elevate them to deity status? No, we just sit back and wait for Carmack to speak.

      • In reply I would merely point you to the bloatware that exists today on all systems. You call that the work of efficient (read good) coders? I don't.
        • "Bloatware" does exist, but not all large programs are bloatware.

          Would you call Quicksort "bloatware" because it uses more stack space than bubblesort?

          The increase in size (both code and memory footprint) of applications is usually accompanied by better reuse, extensibility, portability and speed (better algorithms).

          I doubt the programs you had to hand code in asm for a 2kb machine could be extended very easily.

          Today, we have so much memory and CPU power that we can 'waste' it on stuff like Java, COM, XML etc to make programming reusable components easier.
        • by fusiongyro ( 55524 ) <faxfreemosquito@@@yahoo...com> on Thursday December 12, 2002 @08:46AM (#4869731) Homepage
          To back up the parent of the previous post, I know someone who has been programming since the late sixties or early seventies. While that may not be quite 50 years ago, it certainly means he has had some experience with tape storage, even punch card FORTRAN initially, and probably worse though he doesn't talk about it very often.

          What does he do now? He is still an application writer, his language of choice being Python and his file format of choice being XML. Frankly, I think this is quite telling: his opinion if I understand it correctly is that since we have the power, we shouldn't waste time writing things lower level than necessary. By using Python and XML he's far, far removed from the ordinary perils of yesteryear like memory management, pure procedural programming, even memory and disk size limitations.

          And frankly, while those things are difficult to deal with, they're also very rote and don't leave much expression to the software engineer. People who favor C and to some extent C++ usually admit that there is some pleasure in the sheer amount of control in using the language; it's my opinion that people using Perl, Python and to some extent Java are the people reading books with "Practical" or "Design" in the title, and that's really a better way to do things.

          In reply I would merely point you to the bloatware that exists today on all systems. You call that the work of efficient (read good) coders? I don't.

          It's an easy attack to make, with some degree of merit. The qualifications for being a coder these days are certainly less strict than they were at one time. However, the observation of the post you were replying to was that the older systems had less to do than modern ones. When you resize your browser window you're doing an operation that, as far as a 386 would be concerned, is non-trivial. Add to that the sheer size of the parsed webpage which generates the pretty view you see, and you've got yourself a lot of graphical things to do, and a huge data structure in RAM. This is not the kind of problem that can be solved simply by being able to manually manage memory from assembler. This is the kind of problem that requires an intelligent design from the get-go, so that optimizations can be placed where they are required, as needed.

          Bloatware? Probably. People who needed computers for whatever reason seemed to be getting along with them just fine without GUIs, or multiprocessing, or realtime 3D games. All of these additions are going to consume resources both when written and when used. I won't argue with you that Windows would have been better if it were based on a clean design. Clearly it would have, and on Linux we now have many desktop systems based on (if not a good deal more forethought) at least the trial-and-error process that produced the early GUIs, done with a faster turnaround. Unfortunately, the users have come to rely on GUIs, pretty widgets, and browsers that resize. If they were not, perhaps we could cut down on the code quite a bit.

          Also, one thing about my friend I mentioned earlier: while his code is extremely well-designed, he seems to have a fundamental lack of understanding of ideas such as UI design and concurrency. None of his programs as far as I have seen have used threading, even the GUI ones, and the few GUI programs I have seen were beyond the ugliness I expect from TK. He wrote an abstraction layer for a database that implemented foreign key constraints, and was at a bit of a loss when I first tried to explain to him that it wouldn't necessarily carry over if multiple copies of his application were running simultaneously. So we all have these problems, and we all try to get better.

          If you want to see well designed and implemented code, I recommend you pick up a copy of BeOS. By sacrificing backwards compatibility, they managed to create an operating system from scratch based on object-oriented principles. It's quite amazing when you realize the things that you could do with it that you couldn't do with Windows, yet it was a tiny fraction of the size of Windows when fully installed. For example:

          1. Active queries. Linux acquired something similar via FAM but you need application support for it. Basically, you could search for files based on their attributes, and as files were removed or added to the system they would disappear or appear in the query. The query could be used like a directory for all programs that could access one (AFAICT).
          2. Device drivers took effect immediately upon placing them in the appropriate system directory (except display drivers).
          3. Applications were tiny - the HTML 3.0 compliant browser came in at under a meg for the whole binary. I never saw an app larger than 5 MB.
          4. A full install came in at about 300 MB (comparable to OpenBSD) IIRC.


          Now I'm going to get some sleep and try to forget about the sorry state of computing we're in right now.

          --
          Daniel
        • The values have changed.

          In the past, memory and storage were expensive and limited, and processors were slow.

          Today, Memory and storage are cheap, and processors are fast.

          This has changed the focus. In the past, it was important to get as much done as you could with as little as possible. Today, we sacrifice a bit (perhaps too much) of that because we can afford to. The prime measure of efficiency today is code readability and reusability. Why do you think OOP is so big these days? Do you think it's more efficient, in the traditional sense of making smaller, faster programs that use fewer resources? Not at all. There are optimizing compilers that do a good job of making OO code efficient, but it sure isn't inherent in the design of the languages. That's not the focus. The focus is on readable, reusable code.
    • by Anonymous Coward on Thursday December 12, 2002 @07:47AM (#4869435)
      Yeah? Well at least you HAD ones. We had to make do with zeros. Ever code in unary?
      • Sadly enough, yes.

        It wasn't major, really... just a Turing machine project for a homework assignment. It calculated the function y = 2x + 1. In unary, of course.

        Strangely enough, writing Turing machines didn't greatly increase my appreciation of 0s. My appreciation for having an instruction decoder, however, went through the roof.
    • by Tablizer ( 95088 ) on Thursday December 12, 2002 @01:25PM (#4872261) Journal
      Let's see even the best programmers, used to point-and-drool GUI interfaces and hand-holding wizards, try and do that in 2K now using little more than paper tape! The people who designed, built and programmed these machines REALLY knew what they were doing and probably forgot more about efficient programming and code compression than today's "top" coders ever knew in the first place.

      I remember the good ol' days before lawn mowers were invented. We would stoop over the lawn for weeks with tweezers in hand. Each grass blade was skillfully cut by a true craftsman. Now your "best" lawn mowers simply buzz through a yard, never even seeing individual grass blades.
    • Don't computers older than this run air traffic systems in airports?
    • Is it Y2K compliant?
    • And last and least, imagine a Beowulf cluster of these.
    And BTW, at the link mentioned, they are questioning whether the computer is even running: "From what I read in the links the computer would definitely not work if powered, instead it would probably catch fire".
  • by Anonymous Coward on Thursday December 12, 2002 @07:18AM (#4869311)

    god I feel old...

    Years ago, when I worked at the CSIRO, I worked on this machine for a while. I'm amazed it didn't die long ago. It used RPN for calculations, which takes getting used to, but is far better than algebraic.

    Its processor (not CPU - it consisted of multiple chips) is a hardware FORTH type. The jokes about FORTH programmers are true!

    • FORTH is nowhere near that old. From http://www.forth.com/Content/History/History1c.htm#1.1 [forth.com]:

      Forth was invented by Charles H. (Chuck) Moore.


      A direct outgrowth of Moore's work in the 1960's, the first program to be called Forth was written in about 1970.

      The CSIRAC was a vacuum tube based machine. From http://www.cs.mu.oz.au/csirac/design.html [mu.oz.au]:
      CSIRAC was, of course, a vacuum-tube machine; most of its 2000 tubes were 6SN7, 6V6, EA50 and KT66. Eventually button-based tubes were used in the delay line store electronics, germanium diodes and, much later, George Semkiw re-designed the disk read electronics using germanium transistors.

      And on top of that, ICs weren't invented until 1958.
  • Just imagine how big a computational problem could be solved in 50 years with contemporary P4 hardware, if only Intel would build its hardware to last 50 years. ...Or anything over 5 years - for that matter...

    Now this may not be a problem for home users who buy a complete new system every two-three years (regardless of environmental effects), but I'm sure happy they don't send out space probes which rely on today's state of the art.

    --
    The most likely way for the world to be destroyed, most experts agree, is by accident. That's where we come in: we're computer professionals, we cause accidents -- Nathaniel Borenstein
    • Uh, I understand what you're trying to say but... I've never seen a CPU fail in my entire life.
      Usually when a CPU fails it's as a result of an accident, such as heat sink falling off or excess voltage being applied. I'm not saying it can't happen, I'm just saying you shouldn't be too quick to bash Intel for making unreliable chips.
      As for space-faring hardware, it's custom built both to last and to resist radiation... surely we wouldn't use a P4 on a space probe, but it's more due to its huge power consumption than any inherent unreliability. I'm sure if NASA wanted to send out some P4s, Intel could very well provide suitable chips.
      Note that I'm not at all Intel biased, I run a 1333 Tbird.
    • Just imagine how big a computational problem could be solved in 50 years with contemporary P4 hardware.

      No need to imagine. Suppose we round Moore's Law to a simplistic "double every year", which is about right. (Processors may not move that fast but remember it's the whole computer that affects processing time; add up processor advances, disk advances, memory advances, graphical advances etc. and you get probably more than a doubling per year, so this is conservative.)

      I can start my 50-year computation on my P2000 (processor 2000, not Pentium 2000) in 2000 and be done in 2050.

      Or I can wait a year and buy the P2001 and be done in 25 years, in 2026.

      Or I can wait two years and buy the P2002, and be done in 12.5 years, in 2014.5.

      Or I can wait three years and buy the P2003, and be done in 6.25 years, and finish in 2009.25.

      Or I can wait four years and buy the P2004, and be done in 3.125 years, and finish in 2007.125.

      If I wait five years and buy the P2005, I can be done in 1.5625 years, and finish in 2006.5625.

      If I wait six years and buy the P2006, I can be done in .78125 years, and finish in 2006.78125.

      Because of the continuing exponential growth in power, the value of keeping a fifty-year-old processor online for fifty years is nearly zero once you get past the first few years. Note the P2050 finishes your P2000-50-year task in 50/(2^50) years, or .00000000000004440892 years, which if my calculations are correct is about 1.4 microseconds. Actually I think computational power bottoms out before then, but the principle holds. More specifically to your post, the value of the "contemporary P4 hardware" over 50 years is effectively negative; instead of waiting 50 years, you could have spent the same amount of dough and been done in about six and a half years! Until we stop exponentially advancing, the value of old chips drops like a rock until they are nearly worthless in a mere 3 to 4 years for any serious long-term computation.

      This isn't just theory, either; for some computations, it is more cost-effective to wait for better computers. The constants in the analysis of the first part of this message change (usually an analyst would look at "spending $X" rather than "buying one computer"), but it works out the same. Sometimes you're better off waiting.

      Now, for some people in some situations, practically, old computers can be useful. Don't extend my post past the context I've placed it in. I've got a happily cranking 233MHz P1 at home... but I don't do weather simulations on it for profit; I use it for some web scanning for personal use, in preference to throwing it out. (Even so, in ten years or so, it would be cheaper to turn it off and buy a lower-power-consumption computer...)
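
      A minimal Python sketch of this wait-or-compute tradeoff (assuming an exact yearly doubling, as the post does; the "P-year" machines are hypothetical):

      import math

      TASK = 50.0  # years of computation on year-2000 hardware

      for wait in range(8):
          runtime = TASK / 2**wait  # runtime remaining after the wait
          print(f"buy the P{2000 + wait}: compute {runtime:.5f} yr, finish {2000 + wait + runtime:.5f}")

      # Finish time w + TASK/2**w is minimised where 2**w = TASK * ln 2,
      # i.e. after waiting log2(TASK * ln 2), about 5.1 years.
      print("optimal wait:", math.log2(TASK * math.log(2)), "years")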
  • by anethema ( 99553 ) on Thursday December 12, 2002 @07:21AM (#4869324) Homepage
    "A lot of its components would not stand having voltages applied to them again," says Thorne. "I think it would probably catch fire."

    Reminds me of some of the old Linux kernel code, and thinking it's good to have a sense of humor.

    Trying to get a printer working and getting a kernel message saying Lpt:1 on fire!
  • by h4mmer5tein ( 589994 ) on Thursday December 12, 2002 @07:24AM (#4869336)
    The original story [abc.net.au] appears to have come from Australia's ABC Television [abc.net.au] and reports that:

    " Sadly, it's not an option to make CSIRAC operational again today. Time has taken a toll on this fragile dinosaur.

    So what exactly would happen if anyone tried to relive the magic by switching it on?

    "A lot of its components would not stand having voltages applied to them again," says Thorne. "I think it would probably catch fire."

    • The actual hardware is dead, but as I've said elsewhere an emulator does exist and many of the old programs have been recorded.
  • by Ed Avis ( 5917 ) <ed@membled.com> on Thursday December 12, 2002 @07:24AM (#4869337) Homepage
    300kHz may not sound like much, but with overclocking and a decent watercooling setup you could crank it as high as 334kHz!
  • Unclear (Score:5, Insightful)

    by Russellkhan ( 570824 ) on Thursday December 12, 2002 @07:25AM (#4869344)
    The Geek.com article says:
    " A half-century old computer, called CSIRAC, is still operating in Australia. The computer, which was Australia's first, ran at a blistering 300 kilohertz, had 2 KB RAM, and 2.5 KB storage."

    But the Inquirer article [theinquirer.net] linked by the above Geek.com article says:
    "The machine was the fourth computer to be built anywhere in the world, ran at 0.001MHz, and had a massive 2000 bytes of memory and a behemothic 2500 bytes of storage."

    Which, by my calculations, would be 1000 hertz or 1 kilohertz. I tend to believe the Inquirer, since they're running the source article. And besides, the 1977 Apple ][ was only 1 MHz. Don't you think there was a bit more progress than a mere tripling in processor speed from 1949 to 1977?
    • According to my book on CSIRAC, the initial time for each instruction was 2 milliseconds (which would give a "clock speed" of 500 Hz), and it was sped up to 1 millisecond per instruction (1000 Hz) in 1962.

      Of course, by 1962 CSIRAC was years behind the state of the art.

      • by heikkile ( 111814 ) on Thursday December 12, 2002 @08:11AM (#4869532)
        Those are instruction times, not clock pulses. My first computer (way later, in 1977) had a clock of 1.75MHz, but it took 16 clock pulses for most instructions and 24 for the rest... It too had 2KB of memory, and room to add another 2K, "if someone could find use for all that memory" as it said in the instruction book...

        I sold a few programs for the beast on 2KB EPROMs. You can fit quite a lot of stuff in 2K (for example an editor + assembler + disassembler). Once I added almost 500 bytes to a 2K program, and optimized it back into a 2K chip. Talk about ugly coding; I used all the tricks I knew (reusing jump addresses as instructions, self-modifying code (written backwards in the ROM to save a byte when copying it into RAM), jumping into unrelated routines to reuse 4 bytes of the exit code, you name it). All done in pure hex... Man, those were the days...

        • That's true. According to the book I've got, the CPU of CSIRAC was synchronized to the mercury delay lines, which completed a cycle in about 1 millisecond, so I suppose you could call the clock rate 1 kHz. Each instruction took either 2, 3, or 4 memory cycles to execute (the initial design had every instruction take 4 cycles, but an improved control unit design took advantage of cases where that wasn't necessary). Hence, the machine ran at about 500 instructions per second.
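
          A back-of-envelope Python sketch of that timing (assuming only the book's figures quoted above: a ~1 ms delay-line cycle and 2, 3 or 4 cycles per instruction):

          cycle_s = 1e-3  # one mercury delay-line circulation, ~1 ms
          for cycles_per_insn in (2, 3, 4):
              ips = 1 / (cycle_s * cycles_per_insn)
              print(f"{cycles_per_insn} cycles/insn -> {ips:.0f} instructions/sec")
          # 500, 333 and 250 per second; the ~500/sec figure quoted above
          # corresponds to the common 2-cycle case.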
    • "Which, by my calcuations, would be 1000 hertz or 1 kilohertz. "

      With a clockspeed of 1000 hertz you'd actually be able to hear the thing go "OOOOOOOUUUUUOOOOUOOUUUUOOOOOUUUOUOUOOOOOO".

      Man that !has! to have sucked completely to be a developer back then: "WATCHA SAYIN'?? I GOTTA WHAT??? CHANGE THE POINTER?? I !CAN'T! !HEAR! YA!!"

  • My first computer (10 years or more ago) was a 386 with a 5¼" floppy drive and a 100MB HDD, 640K RAM and no mouse. Windows 3.1 was going to happen. Yeah, it all sounds funny now, but with the recent remark that Moore's law may soon become obsolete, we may not get MUCH faster computers on current technology. I guess we'll have different architectures... quantum computing and DNA computing seem to be hot areas.
    A question, though: was it just built 50 years back, or has it had an uptime of 50 years????? :p
    • Strange config... I mean, we got our Amstrad XT with 640KB and 20MB, and even it had a mouse :).

      Though, back then (late '80s), they sold 8086s, 286s and 386s at the same time. That's like if they still sold Pentium MMXs for the home user and P4s for $8000 to pros...

      We had a pesky 8MHz 8086, my friend had a 12MHz 286, and another friend's father had a 386 in an IBM tower. That was cool.

  • by heytal ( 173090 ) <hetal.rach@gmaRASPil.com minus berry> on Thursday December 12, 2002 @07:33AM (#4869369) Homepage
    Check out this page [csiro.au] which tells us the history of the said computer. In the end, it says the following:

    Following the University of Melbourne's purchase in 1964 of a Control Data 3200 from the USA, CSIRAC was donated to the Museum of Victoria. At this time it was realised that CSIRAC was the oldest computer still in operation, and worthy of preservation so it was carefully dismantled and stored.
    CSIRAC is now the centre-piece of the IT display at the Museum in Melbourne.

    • Replying to own post, but couldn't resist..

      The article here [theinquirer.net] says that it's running.

      The machine was the fourth computer to be built anywhere in the world, ran at 0.001MHz, and had a massive 2000 bytes of memory and a behemothic 2500 bytes of storage.

      And it's still running, now safe in the Melbourne Museum, in Australia.


      Maybe they too don't check facts before reporting.
      But yes, they are different in the sense that they do have a spell checker ;-)
  • by Goonie ( 8651 ) <robert.merkel@be ... g ['ra.' in gap]> on Thursday December 12, 2002 @07:35AM (#4869377) Homepage
    Though CSIRAC is still basically complete, it will never be turned on again. To get it working again would require much wiring to be fixed and a whole bunch of vacuum tubes to be replaced - otherwise, it would be a huge fire risk. However, in the process, you'd destroy much of the historical value of the thing. There's not much point to turning it on again anyway. An emulator was written for it some time ago, and all the old programs that could be located have been transferred and can now be run on the emulator. Ah, the wonders of the Church-Turing thesis...

    As I understand it, the music was recorded by building a replica of the sound hardware and connecting it to the emulator. People who heard the music have confirmed it sounds pretty much like the original in 1955 (IIRC, it was around that time).

    Perhaps the coolest thing that they did with CSIRAC was build a HLL and compiler for it, which they called Autocoder IIRC. It looked like a cross between FORTRAN and BASIC and avoided some of the thinkos of FORTRAN, as far as I could tell.

    CSIRAC is now permanently on display at the museum in Melbourne, Australia. It's the only complete, original machine of its generation in existence, and well worth a look if you come down our way. There is also a book on CSIRAC called "The Last of the First", which is a fascinating read if you can get your hands on a copy.

    One of my university lecturers, Peter Thorne, got his start in computers as an operator for the machine. He met his wife there - she was a fellow computer operator!

  • by bgog ( 564818 )
    The story states that it is still operational. If you follow the links, at the end of the big write-up they ask what would happen if someone tried to power it up. The reply was "probably catch fire".
  • running the search engine on Sourceforge, right?

  • by imag0 ( 605684 ) on Thursday December 12, 2002 @07:45AM (#4869429) Homepage
    Who's there?

    *60 second pause....*

    CSIRAC!
  • Pioneers (Score:3, Funny)

    by muzzmac ( 554127 ) on Thursday December 12, 2002 @08:19AM (#4869573)
    One of my neighbours helped to build CSIRAC. My guess is the computer looks better than he does.

    Great old guy. His wife does a great pumpkin scone.
  • we have a working IBM 1130 here (and the IBM engineer that it was assigned to...) hehe
    www.aconit.org
  • > they really don't make [computers] like they used to

    If automobiles had evolved at the same rate as computers we would all be driving Jaguars that went 250 miles an hour, got 500 miles per gallon, cost $1000, and self-destructed once a year, killing all of the occupants.
  • 50 years at 300KHz (Score:3, Informative)

    by allanc ( 25681 ) on Thursday December 12, 2002 @10:19AM (#4870409) Homepage
    Okay, some quick math:

    50 years * 366 days/year (rounding up) * 24 hours/day * 60 minutes/hour * 60 seconds/minute * 300000 cycles/second = 4.74336e14 cycles

    Now, my Athlon XP 1600:
    4.74336e14 cycles / 1400000000 Hz / 60 sec / 60 min = roughly 94 hours

    So even if this machine were still running (which, incidentally, it's not. RTFA), in terms of pure cycles of functionality pulled out of the machine, my Athlon beat it in the first four days. It's a lot easier to maintain a pair of shoes than it is an airplane. And of course, this machine ISN'T still running, and would likely execute an HCF instruction (Halt and Catch Fire) if powered on, so you really can't call it reliable.

    (Of course, my Athlon's running Windows (needed a games machine), so it's debatable whether or not these cycles have actually been functional...)

    --AC
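
    A Python sketch of the cycle arithmetic above (only the quoted 300 kHz and 1.4 GHz clocks and the poster's generous 366-day years are assumed):

    csirac_hz = 300_000            # 300 kHz
    athlon_hz = 1_400_000_000      # Athlon XP 1600 at 1.4 GHz

    lifetime_cycles = 50 * 366 * 24 * 60 * 60 * csirac_hz
    print(f"CSIRAC lifetime: {lifetime_cycles:.5e} cycles")   # 4.74336e+14

    hours = lifetime_cycles / athlon_hz / 3600
    print(f"Athlon catches up in ~{hours:.0f} hours")         # ~94 hours, i.e. ~4 days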
  • by peter303 ( 12292 ) on Thursday December 12, 2002 @10:28AM (#4870487)
    I would have thought many of them would no longer be manufactured. (Computers went solid state, with discrete transistors, in the late 1950s, and to integrated circuits in the early 1970s.)
  • Here's [piercefuller.com] a brief page about some IBM tube logic modules, schematic for an 'inverter', etc. Anybody with an old 650 lying around, I'll gladly cart it off for you.
  • A few things that should be considered along with the lifetime of current PCs:

    a) Heat and dissipation: They run hot as hell. Yeah, this was filled with vacuum tubes and probably got fairly warm as well, but in modern PCs the heat tends to be focused over particular components, leading to deterioration over time.

    b) Moving parts: Fast-spinning hard drives, fans (see heat, above), etc. The more moving parts you have the greater chance of failure. It also takes more power impulses to start a motor spinning up (hard drive, CD-ROM).

    c) Expected time of usage: We're going through PCs a lot faster than we used to. How long was CSIRAC in use? For most home users, you can usually expect an upgrade at least every 5 years. Perhaps not a new PC, but at least a component. Why build a PC that's going to last forever if it's going to be obsolete very soon - except for servers, etc.?
    • One issue to worry about even if you keep your PC in storage over time is the ROMs. Many PCs use EPROM or flash BIOS. These work on the principle of injecting extra electrons into an insulator and keeping them trapped there. Over time, however, they might slowly leak out and erase the BIOS.

      I don't know if they've improved the specs lately, but IIRC 10 years ago (the last time I was designing hardware), the EPROM chip makers didn't guarantee that they would hold data much longer than a decade.

      • Are these coupled with burn-ins? If it had a partially programmable BIOS coupled with a fixed one, it would be fine so long as you could boot into the CMOS settings to reconfigure everything after the decade is up.

        CMOS batteries don't last that long anyways though.
  • of the lightbulb that's been on for 100 years [thekcrachannel.com].

    (You can look at it online [centennialbulb.org] if you want)

  • I'm sure it would power up in the morning and clear its throat for about five hours, then go in the bathroom for two, then have salt on its oatmeal.

    And all the while, there's a Sun machine thinking "Why can't you just short?! Short and be done with it!"

    I'm having an episode!

  • Has somebody told the NetBSD [netbsd.org] folks about this one? (Read 2nd line from the top center of the link if you don't get it.)

    There will be a celebration to jointly mark its 50th anniversary and its completion of calculating pi to the 4th digit.
  • This is so lazy (Score:2, Interesting)

    by MHV ( 547208 )
    There is something I find annoying about Slashdot: the bad habit of posters leeching news from other sites that already refer to previous coverage on yet another site. This is absurd: I click on Slashdot's link to go to geek.com's link [geek.com], which sends me to The Inquirer [theinquirer.net], from which I can finally reach the real thing [csiro.au]. Is it only me that's irritated, or what? Hey, when I read the same news first on OSNews [osnews.com] (who at least have the decency to link to original sources more often) and some hours later I see that same story on Slashdot, but with the link pointing to OSNews, I find that a bit ridiculous. Not that I think it wrong to acknowledge that news posted on Slashdot came from another news aggregator (that's how one learns about the other ones), but the point is that you end up with a never-ending game of telephone, and the guy at the end of the line says black when you started with white. Or else it's a new way to counter the Slashdot effect, and I'm just not getting it.

  • the article mentions IBM's digital computer in America,
    but doesn't mention that the first digital computer (the 'Z1') was designed in Germany by KONRAD ZUSE:

    Konrad Zuse - Mark I [epemag.com]

  • It's a tube computer, so it HAS to sound better than a solid state one.
  • Oh Woohoo! (Score:2, Funny)

    by fiesty ( 525673 )
    Where can I go to buy one of these? I need to heat my back alley. I'll stretch the power from the warehouse; nice and toasty cardboard box.
  • The trick... (Score:3, Insightful)

    by KC7GR ( 473279 ) on Thursday December 12, 2002 @02:05PM (#4872651) Homepage Journal
    ...is to combine the best ideas from old and new technology alike, and blend them into an entirely new result. That, to my eyes, is what real "innovation" or R&D is all about.

    Some examples: DEC (Digital Equipment), in their heyday, came up with some great techniques for memory management at the hardware level. I'd be curious to know how many of those ideas got rolled over into more current stuff.

    Another one: where would we all be if Xerox's PARC facility had never come up with what has morphed into today's electronic rodent? Heck, IBM was using light pens years before that.

    In short: you don't want to just ignore something because it's "old" or "obsolete" (Essence, I loathe that word!). You need to take the good ideas from the old stuff and build on them.

    Somehow, I doubt that we would have so many tons of electronic junk choking landfills today if computer and electronics hardware were (a) really built to last, like the old stuff was, and (b) built to be easily upgradeable.

  • According to my calculations, if you were to port Unreal Tournament to this machine, you would be able to get 1 frame every ten days!
  • CISRAC photo (Score:3, Informative)

    by ReadParse ( 38517 ) <john@IIIfunnycow.com minus threevowels> on Thursday December 12, 2002 @02:28PM (#4872877) Homepage
    Here's a photo [abc.net.au]
  • by Muad'Dave ( 255648 ) on Thursday December 12, 2002 @02:32PM (#4872916) Homepage
    Had it been running continuously for the past 50 years, it would've performed 4.734e14 instructions. Your newfangled 3.3GHz processor performs that many instructions in 39.85 HOURS.

  • Think about it. That 300 KHz computer may still be running after 50 years, but those 50 years of CPU time add up to about 43 hours of CPU time on a 3.06 GHz Pentium 4. And that's just clock cycles; the Pentium 4 probably gets far more instructions per clock cycle. And, of course, the on-chip cache on the Pentium 4 far exceeds the 2 KB of RAM on that 50-year-old machine.

    All in all, today's fastest Pentium could easily exceed the lifetime processing power of the CSIRAC in just a few hours, at a tiny fraction of the cost. Sure, it's cool that the computer still runs after 50 years, but let's put it into perspective here -- we get far more computing power out of modern chips, even if they fail within a couple years! Longevity isn't everything...

"Look! There! Evil!.. pure and simple, total evil from the Eighth Dimension!" -- Buckaroo Banzai

Working...