50 Year Old Computer Still Going 293
The Angry Mick writes "Geek.com is running a blurb on a 50-year-old CSIRAC computer that is apparently still functional, if lurking in an Australian museum. Sporting a whopping 2K of RAM and screaming along at a blistering 300 kHz(!), it proves the adage that they really don't make 'em like they used to . . ." Yes, because if they did, they'd be really, really slow.
I wonder... (Score:2, Insightful)
Re:I wonder... (Score:2, Funny)
With a 50-year-old ceiling fan. Probably similar to the ones in those old barber shops.
Wow... (Score:3, Funny)
Lurking (Score:2)
Imagine... (Score:5, Funny)
Re:Imagine... (Score:3, Funny)
Re:That's when the joke is the ripest! (Score:3, Funny)
When I'm at my wittiest, she just sort of rolls her eyes, groans, and goes back to whatever she was doing.
Re:Imagine... (Score:2)
built to last (Score:5, Interesting)
Re:built to last (Score:2, Funny)
It wasn't that reliable... (Score:5, Funny)
On one occasion, they gave a demo to an organisation called the Institute of Radio Engineers (IRE), but apparently a memory error occurred and the thing printed "CSIRAC welcomes the members of the IRA" :)
Re:It wasn't that reliable... (Score:5, Funny)
Re:It wasn't that reliable... (Score:2)
Re:built to last (Score:2)
So, realistically, we can expect "not many" of the current computers to survive 50 years, especially without the regular maintenance I'm sure that old beast receives.
It's just like humans, really. The life expectancy goes up as a person gets older. Sort of a "made it this far, will probably make it farther" thing.
Re:built to last (Score:2, Funny)
Re:built to last (Score:2)
Maybe it's because they were built in the UK, before the age of massive 3rd world outsourcing?
A friend of mine had an 8086 that worked fine on 512k and 20 megs of disk, and I have no doubt that I will be married and with children before my 12-year-old 486-33 hits the dumpster...
Re:built to last (Score:4, Funny)
Haven't been married before, have you? (Pictures of my ex-wife seeing my junky old 486 sitting in the corner and exclaiming, "What's that???" flash through my mind.) Basically you can kiss anything over 10 years old goodbye (with the possible exception of family heirlooms studded with diamonds).
Running eh? (Score:5, Insightful)
By reading the horde of nested articles, I got the impression that the machine hasn't run in decades, and probably would not run if powered up.
Correct me if I'm wrong, but please quote a piece that says it is actually running now.
Re:Running eh? (Score:2, Informative)
Re:Running eh? (Score:5, Informative)
The original source [abc.net.au] even says it cannot run.
It was the harebrained TheInquirer [theinquirer.net] article writer who somehow got the impression that it was still running.
Re:Running eh? (Score:2)
ran at 0.001MHz, and had a massive 2000 bytes of memory and a behemothic 2500 bytes of storage.
Those harebrains seem incredulous that it runs at less than 1 GHz or has less than 128 MB of memory or 80 GB of storage. I got the impression that they weren't really reporting so much on the age of the computer as on how slow it was. Kids these days...
Re:Running eh? (Score:5, Informative)
Someone's comment, your quote. It's actually more likely that if they were to power it up, they would apply voltage gradually, to allow the electrolytic capacitors to re-form and the getter rings/compounds [demon.co.uk] in the thermionic devices to restore the vacuum.
It's not unusual for thermionic equipment to survive long periods of time without use. There is still radio equipment from this era running strongly in museums and private collections and, dare I say, in everyday use. The odd capacitor may fail short once in a while, resistors may fail _high_ (they gradually increase resistance with time - a known phenomenon) or valves/tubes may lose a heater or go "soft", but I think it's stretching the imagination somewhat to expect it to burst into flames.
Incidentally, designers from this era often made their chassis live (high potential with respect to ground) so the only thing I'd expect to catch fire would be the young PFY geek leaning on it to get a better view of the thermionics powering up and starting to glow...;o)
Re:Running eh? (Score:2)
This thing is just about the best built and coolest looking piece of equipment I've ever seen. The black steel cabinet is just about indestructible. The glowing green dials and big metal toggle switches look great. The all-tube electronics turns radio static, morse code and beat frequencies into a wonderful eerie ambience that's always changing as you twiddle the knobs.
That has more ram than my present CPU has (Score:3, Funny)
Although I have the advantage of having a whopping 64k of ROM. I only have to use the RAM for my data. I would expect that computer also has to store the program binary in the 2k. Overlays, anyone?
Lately I've been finding it worth my time to spend a few hours recoding some functions in order to shave just a few bytes off their stack usage.
Kids these days, assuming everyone's got 128 megabytes for their application. They just don't code 'em like they used to.
Re:That has more ram than my present CPU has (Score:2, Funny)
Re:That has more ram than my present CPU has (Score:4, Interesting)
I don't even notice unless an app is using over 100M (technically, 100,000KB, but who's counting?)
But it sure would be nice if Windows would notice I have gobs of RAM lying around and start using it for something productive like caching the disk subsystem, rather than the other way around. There is no excuse for a system with >512M of free RAM paging to disk! What ass-backwards VM got stuck into Windows, anyway?
Yup, me too (Score:2)
Re:That has more ram than my present CPU has (Score:2)
Impressive (Score:3, Insightful)
Commodore released the 64 in 1982, this puts it at 20 years of age. That's 30 years between these two machines. When did Moore make that law again?
Yikes, imagine what the computer world will be like in 30 years time! Assuming MS haven't screwed it up for everyone.
Re:Impressive // dollars? (Score:5, Informative)
Btw, C64s feature 64 kB, which is 32 times 2 kB, so memory size doubled at least five times in 20 years; that is, it doubled every four years.
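The doubling arithmetic above can be checked in a couple of lines (Python here purely for illustration):

```python
import math

# The post's back-of-envelope: C64 RAM (64 kB) over CSIRAC RAM (2 kB)
# gives a 32x ratio, i.e. five doublings over the 20 years between them.
doublings = math.log2(64 / 2)
print(doublings)        # 5.0
print(20 / doublings)   # 4.0 years per doubling
```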
--
In theory there is no difference between practice and theory. But in practice there is -- Jan L.A. van Snepscheut
Re:Impressive // dollars? (Score:5, Informative)
The UK's weather bureau give specs on the Cyber 205 they were using in '82:
http://www.met-office.gov.uk/research/nwp/numer
CDC Cyber 205
200Mhz Clock
1 MegaWord of memory
The Cyber had a 64-bit word size, so that amounted to 8 MB of RAM. So clock speed has increased over 600 times and memory has increased over 4000 times in that time frame. This is just confining myself to the 205; I didn't look for the specs on other large machines like the Crays that were available then.
Computers as something just anyone could play with were pretty much nonexistent prior to '77 (true, you could build something ENIAC-like anytime in the seventies if you were REALLY good with electronics). It's more instructive to see what the kind of money they had to spend on the CSIRAC will get you as time moves forward. Power comparable to the C64 was available in the early sixties for that kind of money.
Re:Impressive // dollars? (Score:2)
CSIRAC played the world's first computer music (Score:5, Informative)
Not bad for a living dinosaur. Listen to it [abc.net.au] yourself :)
Re:CSIRAC played the world's first computer music (Score:3, Informative)
It basically sounds like my old Spectrum, only a bit worse.
I wonder what all the background noise on it is, though, it certainly sounded like they had a massive computer in the room while playing it.
lies, all lies... (Score:5, Informative)
These computers are not to be laughed at (Score:5, Insightful)
Re:These computers are not to be laughed at (Score:5, Insightful)
I'm pretty sure they would not have turned up their noses at the idea of being given a machine with a GUI and piles of RAM and storage. Oh, to be able to focus on the problem at hand and not be distracted by the limitations of a 64-byte stack!
To belittle the programmers of today because they have not suffered the restrictions of yesteryear is a bit silly. Even today, there are embedded systems programmers who still deal with such restrictions. Do we elevate them to deity status? No, we just sit back and wait for Carmack to speak.
Re:These computers are not to be laughed at (Score:2, Insightful)
Re:These computers are not to be laughed at (Score:2, Interesting)
Would you call Quicksort "bloatware" because it uses more stack space than bubblesort?
The increase in size (both code and memory footprint) of applications is usually accompanied by better reuse, extensibility, portability and speed (better algorithms).
I doubt the programs you had to hand code in asm for a 2kb machine could be extended very easily.
Today, we have so much memory and CPU power that we can 'waste' it on stuff like Java, COM, XML etc to make programming reusable components easier.
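To make the quicksort/bubblesort point above concrete, here's an illustrative Python sketch: quicksort recurses (so it uses stack space), bubblesort doesn't, and nobody calls quicksort bloated for it.

```python
# Bubblesort: O(1) extra space, O(n^2) time.
def bubblesort(a):
    a = list(a)
    for i in range(len(a)):
        for j in range(len(a) - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

# Quicksort: recursion stack (and here, temporary lists), but O(n log n)
# expected time -- extra space spent for a much better algorithm.
def quicksort(a):
    if len(a) <= 1:
        return list(a)
    pivot = a[len(a) // 2]
    return (quicksort([x for x in a if x < pivot])
            + [x for x in a if x == pivot]
            + quicksort([x for x in a if x > pivot]))

data = [5, 2, 9, 1, 5, 6]
print(bubblesort(data) == quicksort(data) == sorted(data))  # True
```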
Re:These computers are not to be laughed at (Score:5, Insightful)
What does he do now? He is still an application writer, his language of choice being Python and his file format of choice being XML. Frankly, I think this is quite telling: his opinion if I understand it correctly is that since we have the power, we shouldn't waste time writing things lower level than necessary. By using Python and XML he's far, far removed from the ordinary perils of yesteryear like memory management, pure procedural programming, even memory and disk size limitations.
And frankly, while those things are difficult to deal with, they're also very rote and don't leave much expression to the software engineer. People who favor C and to some extent C++ usually admit that there is some pleasure in the sheer amount of control in using the language; it's my opinion that people using Perl, Python and to some extent Java are the people reading books with "Practical" or "Design" in the title, and that's really a better way to do things.
In reply I would merely point you to the bloatware that exists today on all systems. You call that the work of efficient (read good) coders? I don't.
It's an easy attack to make, with some degree of merit. The qualifications for being a coder these days are certainly less strict than they were at one time. However, the observation of the post you were replying to was that the older systems had less to do than modern ones. When you resize your browser window you're doing an operation that, as far as a 386 would be concerned, is non-trivial. Add to that the sheer size of the parsed webpage which generates the pretty view you see, and you've got yourself a lot of graphical things to do, and a huge datastructure in RAM. This is not the kind of problem that can be solved simply by being able to manually manage memory from assembler. This is the kind of problem that requires an intelligent design from the get-go, so that optimizations can be placed in the places where they are required as needed.
Bloatware? Probably. People who needed computers for whatever reason seemed to be getting along with them just fine without GUIs, or multiprocessing, or realtime 3D games. All of these additions consume resources, both when written and when used. I won't argue with you that Windows would have been better if it were based on a clean design. Clearly it would have, and on Linux we now have many desktop systems based on (if not a good deal more forethought) at least the trial-and-error process that produced the early GUIs, done with a faster turnaround. Unfortunately, users have come to rely on GUIs, pretty widgets, and browsers that resize. If they hadn't, perhaps we could cut down on the code quite a bit.
Also, one thing about my friend I mentioned earlier: while his code is extremely well-designed, he seems to have a fundamental lack of understanding of ideas such as UI design and concurrency. None of his programs as far as I have seen have used threading, even the GUI ones, and the few GUI programs I have seen were beyond the ugliness I expect from TK. He wrote an abstraction layer for a database that implemented foreign key constraints, and was at a bit of a loss when I first tried to explain to him that it wouldn't carry over necessarily if multiple copies of his application were running simultaneously. So we all have these problems, and we all try to get better.
If you want to see well-designed and implemented code, I recommend you pick up a copy of BeOS. By sacrificing backwards compatibility, they managed to create an operating system from scratch based on object-oriented principles. It's quite amazing when you realize the things you could do with it that you couldn't do with Windows, yet it was a tiny fraction of the size of Windows when fully installed. For example:
Now I'm going to get some sleep and try to forget about the sorry state of computing we're in right now.
--
Daniel
Re:These computers are not to be laughed at (Score:3, Insightful)
In the past, memory and storage were expensive and limited, and processors were slow.
Today, Memory and storage are cheap, and processors are fast.
This has changed the focus. In the past, it was important to get as much done as you could with as little as possible. Today, we sacrifice a bit (perhaps too much) of that because we can afford to. The prime measure of efficiency today is code readability and reusability. Why do you think OOP is so big these days? Do you think it's more efficient, in the traditional sense of making smaller, faster programs that use fewer resources? Not at all. There are optimizing compilers that do a good job of making OO code efficient, but it sure isn't inherent in the design of the languages. That's not the focus. The focus is on readable, reusable code.
Re:These computers are not to be laughed at (Score:5, Funny)
Unary code (Score:2)
It wasn't major, really... just a Turing machine project for a homework assignment. It calculated the function y = 2x + 1. In unary, of course.
Strangely enough, writing Turing machines didn't greatly increase my appreciation of 0s. My appreciation for having an instruction decoder, however, went through the roof.
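A Turing machine along these lines can be simulated in a few lines of Python. The machine below computes y = 2x + 1 on a unary tape; the states, tape symbols, and transition table are my own reconstruction, not the original homework. Each pass consumes one input '1' (marking it 'X') and appends two '1's past a '#' separator; when the input is exhausted, one final '1' is appended.

```python
BLANK = "_"

# (state, read) -> (write, move, next_state); move is -1 for L, +1 for R
RULES = {
    ("q0", "1"): ("X", +1, "q1"),    # consume one input 1
    ("q0", "#"): ("#", +1, "q5"),    # input exhausted
    ("q0", "_"): ("1", +1, "halt"),  # x = 0: output is a single 1
    ("q1", "1"): ("1", +1, "q1"),    # skip remaining input
    ("q1", "#"): ("#", +1, "q2"),
    ("q1", "_"): ("#", +1, "q2"),    # first pass: lay down the separator
    ("q2", "1"): ("1", +1, "q2"),    # skip existing output
    ("q2", "_"): ("1", +1, "q3"),    # append first output 1
    ("q3", "_"): ("1", -1, "q4"),    # append second output 1, turn around
    ("q4", "1"): ("1", -1, "q4"),    # run back left...
    ("q4", "#"): ("#", -1, "q4"),
    ("q4", "X"): ("X", +1, "q0"),    # ...to just past the consumed cells
    ("q5", "1"): ("1", +1, "q5"),
    ("q5", "_"): ("1", +1, "halt"),  # write the "+1", then stop
}

def two_x_plus_one(unary_input):
    tape = dict(enumerate(unary_input))  # sparse tape, blank elsewhere
    pos, state = 0, "q0"
    while state != "halt":
        write, move, state = RULES[(state, tape.get(pos, BLANK))]
        tape[pos] = write
        pos += move
    return sum(1 for s in tape.values() if s == "1")

print(two_x_plus_one("111"))  # x = 3 in unary -> 7
```

Fifteen transition rules for 2x + 1: a vivid reminder of what an instruction decoder buys you.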
Re:These computers are not to be laughed at (Score:5, Funny)
I remember the good ol' days before lawn mowers were invented. We would stoop over the lawn for weeks with tweezers in hand. Each grass blade was skillfully cut by a true craftsman. Now your "best" lawn mowers simply buzz through a yard, never even seeing individual grass blades.
All the lame questions/jokes (Score:2)
I have used this machine! (Score:3, Interesting)
god I feel old...
Years ago, when I worked at the CSIRO, I worked on this machine for a while. I'm amazed it didn't die long ago. It used RPN for calculations, which takes getting used to, but is far better than algebraic.
Its processor (not a CPU - it consisted of multiple units) is a hardware FORTH type. The jokes about FORTH programmers are true!
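For anyone who hasn't met RPN, a minimal postfix evaluator (a Python sketch, nothing CSIRAC-specific) shows why stack machines like FORTH hardware favor it: no parentheses or precedence rules are needed, just a stack.

```python
# Evaluate a space-separated RPN (postfix) expression using a stack.
def rpn_eval(expr):
    ops = {"+": lambda a, b: a + b, "-": lambda a, b: a - b,
           "*": lambda a, b: a * b, "/": lambda a, b: a / b}
    stack = []
    for tok in expr.split():
        if tok in ops:
            b = stack.pop()          # operands come off the stack
            a = stack.pop()
            stack.append(ops[tok](a, b))
        else:
            stack.append(float(tok))  # numbers go onto the stack
    return stack.pop()

# (3 + 4) * 2 in algebraic notation becomes:
print(rpn_eval("3 4 + 2 *"))  # 14.0
```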
Full of holes (Score:2)
The CSIRAC was a vacuum tube based machine. From http://www.cs.mu.oz.au/csirac/design.html [mu.oz.au]:
And on top of that, ICs weren't invented until 1958.
Hardware Quality (Score:2)
Now this may not be a problem for home users who buy a complete new system every two or three years (regardless of the environmental effects), but I'm sure happy they don't send out space probes that rely on today's state of the art.
--
The most likely way for the world to be destroyed, most experts agree, is by accident. That's where we come in: we're computer professionals, we cause accidents -- Nathaniel Borenstein
Re:Hardware Quality (Score:2)
Usually when a CPU fails, it's the result of an accident, such as a heat sink falling off or excess voltage being applied. I'm not saying it can't happen; I'm just saying you shouldn't be too quick to bash Intel for making unreliable chips.
As for space-faring hardware, it's custom built both to last and to resist radiation... surely we wouldn't use a P4 on a space probe, but it's more due to its huge power consumption than any inherent unreliability. I'm sure if NASA wanted to send out some P4s, Intel could very well provide suitable chips.
Note that I'm not at all Intel biased, I run a 1333 Tbird.
Re:Hardware Quality (Score:2)
No need to imagine. Suppose we round Moore's Law to a simplistic "double every year", which is about right. (Processors may not move that fast, but remember it's the whole computer that affects processing time; add up processor advances, disk advances, memory advances, graphical advances etc. and you probably get more than a doubling per year, so this is conservative.)
I can start my 50-year computation on my P2000 (processor 2000, not Pentium 2000) in 2000 and be done in 2050.
Or I can wait a year and buy the P2001 and be done in 25 years, in 2026.
Or I can wait two years and buy the P2002, and be done in 12.5 years, in 2014.5.
Or I can wait three years and buy the P2003, and be done in 6.25 years, and finish in 2009.25.
Or I can wait four years and buy the P2004, and be done in 3.125 years, and finish in 2007.125.
If I wait five years and buy the P2005, I can be done in 1.5625 years, and finish in 2006.5625.
If I wait six years and buy the P2006, I can be done in 0.78125 years, and finish in 2006.78125.
Because of the continuing exponential growth in power, the value of keeping a fifty-year-old processor online for fifty years is nearly zero once you get past the first few years. Note the P2050 finishes your P2000-50-year task in 50/(2^50) years, or about 1.4 microseconds.
This isn't just theory, either; for some computations, it is more cost-effective to wait for better computers. The constants in the analysis in the first part of this message change (usually an analyst would look at "spending $X" rather than "buying one computer"), but it works out the same. Sometimes you're better off waiting.
Now, for some people in some situations, practically, old computers can be useful. Don't extend my post past the context I've placed it in. I've got a happily cranking 233MHz P1 at home... but I don't do weather simulations on it for profit, I use it for some web scanning as a personal use in preference to throwing it out. (Even so, in ten years or so, it would be cheaper to turn it off and buy a lower-power-consumption computer...)
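The wait-vs-compute arithmetic above can be sketched as a toy model, using the post's simplified speed-doubles-every-year assumption (the function name and parameters here are mine, for illustration only):

```python
# Finish date for a job needing `job_machine_years` of year-2000 compute,
# if you wait `wait_years` before buying, and speed doubles yearly.
def finish_year(start_year, wait_years, job_machine_years=50.0, base_year=2000):
    speedup = 2 ** (start_year + wait_years - base_year)
    return start_year + wait_years + job_machine_years / speedup

for wait in range(7):
    print(wait, finish_year(2000, wait))
# Past roughly five years of waiting, the finish date stops improving:
# the job itself has become negligible next to the time spent waiting.
```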
Lpt:1 would really be on fire! (Score:3, Funny)
Reminds me of some of the old linux kernel code, and thinking its good to have a sense of humor.
Trying to get a printer working and getting a kernel message saying Lpt:1 on fire!
It's dead and gone ( unfortunately ) (Score:4, Informative)
" Sadly, it's not an option to make CSIRAC operational again today. Time has taken a toll on this fragile dinosaur.
So what exactly would happen if anyone tried to relive the magic by switching it on?
"A lot of its components would not stand having voltages applied to them again," says Thorne. "I think it would probably catch fire."
Yes and no... (Score:2)
Not as slow as it sounds (Score:5, Funny)
Unclear (Score:5, Insightful)
" A half-century old computer, called CSIRAC, is still operating in Australia. The computer, which was Australia's first, ran at a blistering 300 kilohertz, had 2 KB RAM, and 2.5 KB storage."
But the Inquirer article [theinquirer.net] linked by the above Geek.com article says:
"The machine was the fourth computer to be built anywhere in the world, ran at 0.001MHz, and had a massive 2000 bytes of memory and a behemothic 2500 bytes of storage."
Which, by my calculations, would be 1000 hertz or 1 kilohertz. I tend to believe the Inquirer, since they're running the source article. And besides, the 1977 Apple ][ was only 1 MHz. Don't you think there was a bit more progress than less than a doubling in processor speed from 1949 to 1977?
500 hz initially, 1 khz later (Score:2)
Of course, by 1962 CSIRAC was years behind the state of the art.
Re:500 hz initially, 1 khz later (Score:5, Informative)
I sold a few programs for the beast on 2 KB EPROMs. You can fit quite a lot of stuff in 2K (for example, an editor + assembler + disassembler). Once I added almost 500 bytes to a 2K program and optimized it back into a 2K chip. Talk about ugly coding: I used all the tricks I knew, like reusing jump addresses for instructions, self-modifying code (written backwards in the ROM to save a byte when copying it into RAM), and jumping into unrelated routines to reuse 4 bytes of exit code, you name it. All done in pure hex... Man, those were the days...
Hmmm... (Score:2)
1000 hertz would be mindnumbing (Score:2, Funny)
"Which, by my calcuations, would be 1000 hertz or 1 kilohertz. "
With a clockspeed of 1000 hertz you'd actually be able to hear the thing go "OOOOOOOUUUUUOOOOUOOUUUUOOOOOUUUOUOUOOOOOO".
Man that !has! to have sucked completely to be a developer back then: "WATCHA SAYIN'?? I GOTTA WHAT??? CHANGE THE POINTER?? I !CAN'T! !HEAR! YA!!"
evolution (Score:2)
A question, though: was it just built 50 years back, or has it had an uptime of 50 years?
Re:evolution (Score:2)
Though, back then (late '80s), they sold 8086s, 286s and 386s at the same time. That's as if they still sold Pentium MMXs for home users and $8000 P4s to pros.
We had a pesky 8 MHz 8086, my friend had a 12 MHz 286, and another friend's father had a 386 in an IBM tower. That was cool.
The computer is dismantled and stored... (Score:4, Informative)
Following the University of Melbourne's purchase in 1964 of a Control Data 3200 from the USA, CSIRAC was donated to the Museum of Victoria. At this time it was realised that CSIRAC was the oldest computer still in operation, and worthy of preservation so it was carefully dismantled and stored.
CSIRAC is now the centre-piece of the IT display at the Museum in Melbourne.
Re:The computer is dismantled and stored... (Score:2)
The article here [theinquirer.net] says that it's running.
The machine was the fourth computer to be built anywhere in the world, ran at 0.001MHz, and had a massive 2000 bytes of memory and a behemothic 2500 bytes of storage.
And it's still running, now safe in the Melbourne Museum, in Australia.
Maybe they too don't check facts before reporting.
But yes, they are different in the sense that they do have a spell checker
CSIRAC will never run again... (Score:5, Informative)
As I understand it, the music was recorded by building a replica of the sound hardware and connecting it to the emulator. People who heard the music have confirmed it sounds pretty much like the original in 1955 (IIRC, it was around that time).
Perhaps the coolest thing that they did with CSIRAC was build a HLL and compiler for it, which they called Autocoder IIRC. It looked like a cross between FORTRAN and BASIC and avoided some of the thinkos of FORTRAN, as far as I could tell.
CSIRAC is now permanently on display at the museum in Melbourne, Australia. It's the only complete, original machine of its generation in existence, and well worth a look if you come down our way. There is also a book on CSIRAC called "The Last of the first", which is a fascinating read if you can get your hands on a copy.
One of my university lecturers, Peter Thorne, got his start in computers as an operator for the machine. He met his wife there - she was a fellow computer operator!
Not still operational!! (Score:2, Informative)
That's the one (Score:2, Funny)
Knock Knock! (Score:3, Funny)
*60 second pause....*
CSIRAC!
Pioneers (Score:3, Funny)
Great old guy. His wife does a great pumpkin scone.
old computer still working... (Score:2)
www.aconit.org
Speed of Computer Evolution (Score:2)
If automobiles had evolved at the same rate as computers we would all be driving Jaguars that went 250 miles an hour, got 500 miles per gallon, cost $1000, and self-destructed once a year, killing all of the occupants.
Re: Self-destruction? (Score:2)
Re: Self-destruction? (Score:2)
> are separate issues.
Oh, you're right, I forgot that using Linux keeps your hard drives from failing, forces everybody to perform daily backups, keeps worms and viruses from affecting your system, makes CPU fans last forever...
50 years at 300KHz (Score:3, Informative)
50 years * 366 days/year (rounding up) * 24 hours/day * 60 minutes/hour * 60 seconds/minute * 300000 cycles/second = 4.74336e14 cycles
Now, my Athlon XP 1600:
4.74336e14 cycles / 1400000000 Hz / 60 sec / 60 min = roughly 94 hours
So even if this machine were still running (which, incidentally, it's not. RTFA), in terms of pure cycles of functionality pulled out of the machine, my Athlon beat it in the first four days. It's a lot easier to maintain a pair of shoes than it is an airplane. And of course, this machine ISN'T still running, and would likely execute an HCF instruction (Halt and Catch Fire) if powered on, so you really can't call it reliable.
(Of course, my Athlon's running Windows (needed a games machine), so it's debatable whether or not these cycles have actually been functional...)
--AC
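The cycle arithmetic above is easy to sanity-check with a short script (Python, purely illustrative):

```python
# 50 years of CSIRAC at 300 kHz, rounding up to 366 days/year as the post does.
csirac_cycles = 50 * 366 * 24 * 60 * 60 * 300_000
print(csirac_cycles)        # 474336000000000, i.e. ~4.74e14 cycles

# How long a 1.4 GHz Athlon XP takes to execute that many cycles.
athlon_hours = csirac_cycles / 1_400_000_000 / 60 / 60
print(round(athlon_hours))  # 94 hours, i.e. under four days
```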
Re:50 years at 300KHz (Score:4, Funny)
So might your Athlon [tomshardware.com], son... So might your Athlon.
how do you replace vacuum tubes? (Score:4, Interesting)
Re:how do you replace vacuum tubes? (Score:2)
If you really want to be pedantic, you could point out that since a CRT is a vacuum tube, there are a LOT of tubes still being produced. Those won't help you repair a half-century old computer, though.
Vacuum tube logic (Score:2)
Don't make 'em like they used to (Score:2)
a) Heat and dissipation: They run hot as hell. Yeah, CSIRAC was filled with vacuum tubes and probably got fairly warm as well, but in modern PCs the heat tends to be focused on particular components, leading to deterioration over time.
b) Moving parts: Fast-spinning hard drives, fans (see heat, above), etc. The more moving parts you have, the greater the chance of failure. It also takes more power to spin a motor up from a standstill (hard drive, CD-ROM).
c) Expected time of usage: We're going through PCs a lot faster than we used to. How long was CSIRAC in use? For most home users, you can usually expect an upgrade at least every 5 years. Perhaps not a new PC, but at least a component. Why build a PC that's going to last forever if it's going to be obsolete very soon? (Servers, etc. are another matter.)
Re:Don't make 'em like they used to (Score:2)
I don't know if they've improved the specs lately, but IIRC 10 years ago (the last time I was designing hardware), the EPROM chip makers didn't guarantee that they would hold data much longer than a decade.
Re:Don't make 'em like they used to (Score:2)
CMOS batteries don't last that long anyway, though.
Reminds me.. (Score:2)
(You can look at it online [centennialbulb.org] if you want)
were it running (Score:2)
And all the while, there's a Sun machine thinking "Why can't you just short?! Short and be done with it!"
I'm having an episode!
Nonredudant (I hope) One Liners (Score:2)
There will be a celebration to jointly celebrate its 50th anniversary and its completion of calculating pi to the 4th digit.
This is so lazy (Score:2, Interesting)
Konrad Zuse (Score:2)
The article mentions IBM's digital computer in America, but doesn't mention that the first digital computer (the 'Z1') was designed in Germany by Konrad Zuse:
Konrad Zuse - Mark I [epemag.com]
But does it sound better than a solid state comp.? (Score:2)
Oh Woohoo! (Score:2, Funny)
The trick... (Score:3, Insightful)
Some examples: DEC (Digital Equipment), in their heyday, came up with some great techniques for memory management at the hardware level. I'd be curious to know how many of those ideas got rolled over into more current stuff.
Another one; Where would we all be if Xerox's PARC facility had never come up with what has morphed into today's electronic rodent? Heck, IBM was using light pens years before that.
In short; You don't want to just ignore something because it's "old" or "obsolete" (Essence, I loathe that word!). You need to take the good ideas from the old stuff and build on them.
Somehow, I doubt that we would have so many tons of electronic junk choking landfills today if computer and electronics hardware were (a) really built to last, like the old stuff was, and (b) built to be easily upgradeable.
Unreal Tournament (Score:2)
CISRAC photo (Score:3, Informative)
Had that thing been running... (Score:3, Interesting)
50 years or a few hours? Which is better? (Score:2)
All in all, today's fastest Pentium could easily exceed the lifetime processing power of the CSIRAC in just a few hours, at a tiny fraction of the cost. Sure, it's cool that the computer still runs after 50 years, but let's put it into perspective here -- we get far more computing power out of modern chips, even if they fail within a couple years! Longevity isn't everything...
Weather prediction ? (Score:2, Informative)
Those tasks usually require large amounts of data to be processed
--
Stefan
DevCounter [berlios.de]
An open, free & independent developer pool
created to help developers find other developers, help,
testers and new project members.
Re:Weather prediction ? (Score:3, Insightful)
It seems laughable to us now, but back then, it was an advancement. Ever onward and upward; such is the progress of computing.
Older chips are in fact used for reliable systems (Score:3, Informative)
It's not just that the simpler chips are more reliable; they also use less power, generate less heat, cost less, take up less space, and weigh less.
I have heard that the ARM chip is the most popular for embedded applications these days, and many of the ARM chips in use are quite tiny, have no cache, and run in the 40 MHz range, like the ARM7TDMI.
68000-based chips from Motorola are also very popular.
And check out uCLinux [uclinux.org], a linux port to several microprocessors that run without a memory management unit.
Why bother with an MMU when there's no disk to swap to, and the failure of a user program would mean the failure of the whole system?
Re:speccy (Score:2)
Re:speccy (Score:2)
Re:This is news?!?!? (Score:3, Insightful)
The article is clearly dated "Dec 10 2002" so it's not "from around a year ago" at all - no idea where you got that from.
Nick...
Re:300,000 Hz (Score:2)
Re:300,000 Hz (Score:2)
I doubt it. Digital computers and neural nets (brains) are just good at different things. Digital computers are good at exact things, such as math. Brains are excellent at fuzzy logic. That's why it's so easy for you to read this comment: your brain says, "That looks pretty close to a 'T', so it's a T." A digital computer actually needs to run a neural net simulation to do OCR, which takes quite a bit of power.
On the other side of things, try to compete with this 50 year old computer on algebraic formulas. You'd lose. Computers are just better at that.