Technology

Intel 4004 Turns 30

fm6 writes: "Just the thing to remind an aging geek of his mortality: this week marks the 30th anniversary of the Intel 4004, the very first microprocessor. Another historical page here, and a column bemoaning the absence of dancing in the streets here. Trivia -- why 4004? Because it was the fourth component in a 4-bit chipset." You might want to read the interview with Ted Hoff from a few months ago; it's pretty informative about the origins of the 4004.
  • I thought... (Score:2, Flamebait)

    by scaramush ( 472955 )
    The best way to celebrate would be the release of a new Athlon ;)
  • by ackthpt ( 218170 ) on Tuesday November 13, 2001 @12:54PM (#2559256) Homepage Journal
    Is it still in production anywhere and what's the current record for overclocking one of these babies?
    • The Intel 4004 executes about 100,000 instructions per second

      You can probably overclock to 101,000 i/s or so. But watch the heat! You can slightly warm the surrounding air with one of these monsters.
    • I don't think they've been in production for a long time. Go punch in "intel 4004" on eBay.
      • Actually, you can get them [ebay.com]...they're marketed as collector's items, tho :)
      • Should be easily synthesizable though. OpenCores doesn't have one, but if someone can synthesize an ARM core, surely someone somewhere has synthesized the 4004.

        I did a quick search, but it seems as if every VHDL engineer in the country has a phone number ending in 4004, so the search results weren't too easily navigable :)

        • Should be easily synthesizable though. OpenCores doesn't have one, but if someone can synthesize an ARM core, surely someone somewhere has synthesized the 4004.

          Pardon me for asking the obvious question, but, err - why would you want to? Is there some killer app for the 4004 that I'm not aware of? Somebody did a four-bit version of Wumpus or something?
          • why would you want to?

            CS and other hobbyists get all sniffly for the good old days. I was having a huge amount of fun a couple weeks ago hacking a 6502. That we even recall such an occasion should suggest to you that so long as some of us are alive, remembering and playing around with such artifacts defines who we are.

          • Because:

            1... Why not? People are synthesizing other CPUs that aren't really 'useful' these days.

            2... Because you can!

            3... Because it's part of our history, and keeping these things alive is part of our duty to preserve the history of computer science. Even if a synthesized core only matches the chip in question from an external point of view, it still preserves the memory. A well-written VHDL specification should document how the chip works for anyone in the future to examine.

          • One might as well ask why anyone would bother building a working Babbage Difference Engine, yet it has been done (and the sucker works too!). I suppose the rationale for building a trebuchet is slightly more obvious (throwing things a long distance).

            Hack value has its own sweet savor.
          • Actually, the killer app in question was a Japanese calculator, programmable I believe.

            /Brian
    • Try the emulator! (Score:4, Interesting)

      by VDM ( 231643 ) on Tuesday November 13, 2001 @01:07PM (#2559346) Homepage
      Perhaps no overclocking, and no Linux on it, but the other way around works: there is a 4004 software emulator for Linux (e.g., i4004em [cnlab-switch.ch]).
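      For a rough idea of what such an emulator does at its core, here is a minimal Python sketch of a toy 4-bit accumulator machine. The mnemonics and encoding are made up for illustration; they are not the real 4004 instruction set, and this is not how i4004em is implemented.

      # Toy 4-bit accumulator machine: fetch, decode, execute.
      # Illustrative only -- not the real 4004 opcodes.
      def run(program, max_steps=1000):
          acc = 0            # 4-bit accumulator
          pc = 0             # 12-bit program counter (the 4004 addressed 4K of ROM)
          for _ in range(max_steps):
              op, arg = program[pc]
              pc = (pc + 1) & 0xFFF
              if op == "LDM":        # load immediate into the accumulator
                  acc = arg & 0xF
              elif op == "ADD":      # add immediate, wrapping to 4 bits
                  acc = (acc + arg) & 0xF
              elif op == "JMP":      # unconditional jump
                  pc = arg & 0xFFF
              elif op == "HLT":      # stop
                  break
          return acc

      # 9 + 9 = 18 wraps to 2 in a 4-bit register.
      print(run([("LDM", 9), ("ADD", 9), ("HLT", 0)]))   # prints 2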
  • Hey! (Score:3, Funny)

    by 7608 ( 515533 ) on Tuesday November 13, 2001 @12:55PM (#2559267)
    Unless you can port Linux to this, why do we care? This is Slashdot; we have standards!

    • SHHHHH

      Don't get those embedded Linux project guys thinking; you never know how many months of StrongARM development could be lost to the 4004.
  • Perspective (Score:1, Redundant)

    by inc0gnito ( 443709 )
    Personally, I wasn't around back then. But looking at the predecessors to today's microprocessors on Intel's website really puts the accomplishments of the last thirty years into perspective. It's staggering how far we've come in such a short period of time.
    • by ackthpt ( 218170 ) on Tuesday November 13, 2001 @01:05PM (#2559326) Homepage Journal
      I was around back then. My father even tried to get me interested in hobby computing, as opposed to my high-voltage experiments with transistors, capacitors, resistors and other things that could explode and poke an eye out. Eventually I got access to a DEC PDP system while in Explorers (at Dow Chemical in Midland, MI, no less), and once I discovered big, huge, high-current processors (all TTL logic *8^) you could fry an egg on, I never looked back. (Much like today's P4 and Athlon -- hey, Thanksgiving baking tip: toss a turkey in one of these machines and cook it in half the time!)

      Maybe some day, when I get tired of making small electronic circuits explode, I'll get one of these and build a SAP (simple-as-possible) computer out of one, just for jollies, assuming I still have eyes left.

  • Uh oh... (Score:2, Funny)

    by Anonymous Coward
    ...pretty soon, it will start experiencing its midlife crisis.
    • Re:Uh oh... (Score:3, Funny)

      by j7953 ( 457666 )
      ...pretty soon, it will start experiencing its midlife crisis.

      At Intel, they call it the Aging Microprocessor Depression.

  • by Anonymous Coward
    Doesn't the modern Intel Pentium 4 run just about as fast as this one too?

    Haha.

  • As I'm only almost 18, I've only been around for a little less than two thirds of its lifespan, but I'm impressed by the progress as well. I still remember playing Hard Hat Harry on the old Apple II/e in 4th grade and Where in Space is Carmen Sandiego in 5th. Now, I'm a hard-core Rogue Spear and Counterstrike player, so I appreciate everything that Intel has done for us. Here's to thirty more years, Intel.
    • I loved HHH and M.U.L.E and Archon and all the old 8-bit EA games.

      Although Civ III on a P3-900 is pretty kick ass.
    • You probably already know this, but Apple employed 6502s, which were made at the time by (correct me if I'm wrong) MOStek.

      I recall the 6502 instruction set as being mingy when compared to the Z80 (for instance); I regarded that as a handicap at the time (this was 1981 or so) but it was probably a good thing for a beginning assembly hobbyist to cut his teeth on.
      • by Monte ( 48723 )
        I recall the 6502 instruction set as being mingy when compared to the Z80 (for instance); I regarded that as a handicap at the time (this was 1981 or so) but it was probably a good thing for a beginning assembly hobbyist to cut his teeth on.

        This is true. The 6502 was a competitor to Intel's 8080. The latter was much more powerful (and the later Z80 by Zilog was like an 8080 on steroids), but the former was cheaper... IIRC something like $400 for the 8080 and $250 for the 6502 (and these are mid-70s dollars). Also the support circuitry for the 6502 was simpler, further reducing cost.

        Pardon if I got some facts wrong, age and alcohol have taken a good number of brain cells :)
          The 6502's initial excitement was the price -- it was the first real $20 microprocessor. The 8080 cost quite a bit more in 1976. The 6502 was, I would suggest, a far more elegant chip than the 8080. The 6502 was Chuck Peddle's second microprocessor. He worked on the Motorola 6800, then jumped ship to MOS Technology. The 6800 was loosely inspired by the PDP-11, a very nice machine for the day. The 6502 rethought a few of the 6800's design decisions and was even nicer. Apple picked the right chip.

          BTW the original 6501 was pin compatible with the 6800, but didn't ship because Moto made some threats. The 6502, I think, added an internal clock; there were a bunch of other 65xx chips that followed. Rockwell was the second source and kept the line going after MOS Tech, which had been bought by Commodore, tanked. I think it's still fairly popular in the embedded world.

          The 8080's instruction set was more reminiscent of the PDP-8, and rather ugly. The 4004 was not really a microprocessor, just a controller, because it lacked interrupts and some other features needed to be a "computer". The 8008 was a major improvement. And only 18 pins -- a lot of jelly beans needed to run it! The 8085 was the easier one to glue to. I recently read an article explaining the history of those early Intels, but I don't recall where.
      • Actually, it was MOS Technology; how the lawyers allowed those two companies to co-exist in the same market, I'll never know.

        I liked the 6502 instructions. There was an elegance to them, a symmetry that seemed missing in the 8080 and its ilk. When in doubt, Intel threw registers and special instructions at the problem (never mind the Z80's two complete register banks), whereas MOS seemed to favour soft solutions (don't need no stinking "multiply"), and of course memory-mapped I/O.

  • by shaunak ( 304231 ) <(shaunak) (at) (gmx.net)> on Tuesday November 13, 2001 @01:07PM (#2559344) Homepage
    "this week marks the 30th anniversary of the Intel 4004, the very first microprocessor. "

    What?
    I thought Microsoft made the first microprocessor after purchasing the idea from Al Gore.
    But, well, if they say so on Slashdot, it MUST be true.
  • Every day is the 5n-th anniversary of 0.055% of everything that ever happened, and as a rule we celebrate very little of it, or it would occupy all our time.

    Do we really think the 4004 might be offended by the oversight, or that microprocessors in general aren't getting enough attention in the press? I think the computer industry as a whole could be modded down a point as it is.
  • The 8080 (Score:5, Interesting)

    by Reality Master 101 ( 179095 ) <`moc.liamg' `ta' `101retsaMytilaeR'> on Tuesday November 13, 2001 @01:11PM (#2559364) Homepage Journal

    The 4004 was certainly a significant milestone, but I think the 8080 launched in 1974 was truly the "Model T" of the computer industry. That was the chip that was general enough to really run everything. It was the basis for all the microcomputers and the CP/M operating system.

    In fact, I believe Zilog Z80s (an 8080 clone with some extra instructions -- around 1977?) are still being manufactured as controllers in various products.

    • Re:The 8080 (Score:5, Informative)

      by Reality Master 101 ( 179095 ) <`moc.liamg' `ta' `101retsaMytilaeR'> on Tuesday November 13, 2001 @01:23PM (#2559440) Homepage Journal

      In fact, here's Zilog's page on the Z80 [zilog.com] still in production after 25 years! How many other computer technologies do you know that are still available after 25 years? Pretty remarkable.

      Talk about a company milking something for all it's worth! :)

      • If you write assembly programs for the TI-83 graphing calculator, you're writing in Z80 assembly. I don't know if they actually use a Z80 chip, or if they're using a clone or a faster-but-similar chip.
      • I agree, the Z80 is an important bit of technology. Is there any CPU that has gained such wide acceptance or been in continuous production for so long?

        Still, the Z80 now is nothing to what it might have been. I remember when Z80/CPM systems were the standard for serious desktop computing. Even Apple II people used a Z80 card to run business apps. (There was an MS version -- their first hardware product!) I think this Apple/Zilog combo was the most common desktop business computer at one time. If things had gone just a little differently...

        • It's worth pointing out that Zilog came out with a Z8000, which was a 16-bit version of the Z80, but it was a massive failure. The other competitor, of course, was the 68000 (which Apple chose). The rest of the industry (read: IBM) picked the 8086. I don't think the Z8000 was ever really in the running. I don't know much about it, so I'm not sure why.

          Trivia: I actually saw an early prototype of an IBM PC that used a 68000 microprocessor. Really! We were contracted to port some products to it (our shop had a bunch of 68000 products), but the contract never came through for obvious reasons.

          • Having had the pleasure of programming the Z8K, I can tell you it was not a 16-bit version of the Z80.

            The Z8000 was available in four flavors:

            Z8001 Segmented (8MB address space)

            Z8002 Non-Segmented (64KB address space)

            Z8003 Segmented (8MB address space), Virtual Memory Support

            Z8004 Non-Segmented (64KB address space), Virtual Memory Support

            The Segmented CPUs had a flag bit that allowed them to run in non-segmented mode.

            The Z8000 was much closer architecturally to the 68K family than the Z80/x86 family. It had 16 orthogonal, 16-bit registers (R0-R15), which could be paired up as 8 32-bit registers (RR0-RR14). R15 (non-segmented mode) or RR14 (segmented mode) was the stack pointer.
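            To picture the register pairing described above, here is a tiny Python sketch. The helper names are mine and purely illustrative, and which half holds the high word is this sketch's convention, not necessarily Zilog's: sixteen 16-bit registers, with an adjacent pair read back as one 32-bit value.

            # Sixteen 16-bit registers; RRn pairs Rn and Rn+1 into one 32-bit value.
            # (Which half is the high word here is this sketch's convention.)
            R = [0] * 16

            def read_rr(n):
                """Read the 32-bit register pair RRn (n even)."""
                return (R[n] << 16) | R[n + 1]

            def write_rr(n, value):
                """Write a 32-bit value into register pair RRn."""
                R[n] = (value >> 16) & 0xFFFF
                R[n + 1] = value & 0xFFFF

            write_rr(14, 0x00123456)     # RR14: the segmented-mode stack pointer
            print(hex(read_rr(14)))      # 0x123456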

            The opcode names were similar to the Z80, but the architecture was vastly different. The Z8000 series was popular in embedded and military applications. Unfortunately, I don't believe Zilog ever built the Z8070 FPU for the processor, which also hindered its acceptance as a mainstream CPU.

            Anyone out there remember the Zilog ZEUS System 8000? It was a Unix System III variant.

            • Anyone out there remember the Zilog ZEUS System 8000? It was a Unix System III variant.
              Was working for Zilog at the time. ZEUS (Zilog Extended Unix System) came out before System III was officially released. Pretty sure it was based on V7, not System III.

              At the time I thought this was Big Stuff. Unix on a microcomputer! But in hindsight, you really need a VM to have a serious modern OS. That's why there's no Linux for the 80286! I think the first commercial Unix to do this was CTIX, which Convergent Technologies created (System III/V, with the BSD VM, running on a 68010 with proprietary memory management hardware) for its MegaFrame box. The MegaFrame was a disaster (tried to be too many things at once), but it paved the way for the first 680x0 Unix boxes -- Convergent's main claim to fame before they were absorbed by Unisys.

              • ZEUS (Zilog Extended Unix System) came out before System III was officially released. Pretty sure it was based on V7, not System III.

                I'm pretty sure that uname indicated SysIII. Of course, that was ZEUS 3.1 and 3.2. YMMV for earlier versions.

                Was there really any difference between V7 and SysIII?
                • That's a very good question. On the one hand, they were quite far apart on the Unix family tree [wanadoo.fr]. On the other hand, I don't recall anybody abandoning their V7 ports and starting over with System III, when AT&T decided that the latter was the official commercial Unix. Probably everybody just folded in System III features. I'm guessing that V7 backward compatibility was never an issue, though I'm hardly the expert.
          • Actually, I was working for Zilog at the time. I wasn't working on the component side (I was hired to document their Z8000-based Unix box), but I seem to recall that the Z8000 was a substantial change in architecture from the Z80. There was a school of thought that said that 16-bit processors should not try to be backward compatible with 8-bit processors. Thus Motorola chose to design the 68000 from scratch, rather than upgrading the 6800. I'm pretty sure Zilog made a similar decision.

            By contrast, Intel saw the 8086 as a continuation of the 8080. So they came up with this segmented architecture that looked much the same as an 8080 at the 64K address space level. (The binaries were different, but you could port an 8080 assembly program simply by reassembling it. Which is why Microsoft Basic was so slow on DOS!) Bad from an engineering point of view (we still have issues with memory models on Windows!), but that sort of cautious technology probably had a lot to do with Intel beating out Motorola to provide the PC's CPU.

    • Re:The 8080 (Score:3, Insightful)

      by N2UX ( 237223 )
      In my opinion, the DEC PDP-8 [pdp8.org] was the "Model T" of the computer industry. It was essentially the first computer you could actually buy, instead of having to lease. At an entry-level cost of around US$18,000, it was easily affordable by most businesses and universities that needed a computer. There were enough of them made that a lot of third parties developed add-ons. Also, there are still a few PDP-8s running production applications as controllers for manufacturing machinery.
      • Well, I was limiting myself to microprocessors, but if you want to refer to general purpose computers, it was probably the IBM 360 that really brought computers into mass production as general purpose devices.

    • Re:The 8080 (Score:2, Interesting)

      by DdJ ( 10790 )
      In fact, I believe Zilog Z80s (an 8080 clone with some extra instructions -- around 1977?) are still being manufactured as controllers in various products.

      Heck, it's not just used for controllers: a Z80 clone is at the heart of every Gameboy and Gameboy Color (not Gameboy Advance).
    • Comment removed based on user account deletion
      • The Zilog had a much larger instruction set and many of the shared instructions executed in fewer CPU cycles.

        Well, "much larger" is a bit of an exaggeration, although there's no doubt they added a bunch of useful instructions. What really rocked were the extra registers. They really came in handy.

        The 8-bit processor that really gets no respect is the Motorola 6809. Lots of registers, and the instruction set was orthogonal with respect to them. It was great, but not really widely used for some odd reason.

        Not much has changed, has it? :)

    • The 8080 was the first 8-bit microprocessor, wasn't it?

      I'm still a 65xx fan. The 6502 was a really nice little chip for the time; it actually managed to survive in ordinary use (in the 6510 format) until the late '80s.

      Imagine a chip design being good enough to last ten years now.

  • That was back when they said we'd only need 16 bytes of memory -- who will need any more...

    (yeah I realize.....it's a joke.)
  • by micromoog ( 206608 ) on Tuesday November 13, 2001 @01:12PM (#2559377)
    From Intel's site [intel.com]:

    The 8008 was twice as powerful as the 4004.

    If only naming conventions could make that much sense today . . .

    • The 8008 was twice as powerful as the 4004.

      If only naming conventions could make that much sense today . . .

      "Today on Tom's Hardware: a review of the brand new Intel 4,198,498,304 processor."

      I'll stick with the Athlon model numbers, thank you.

  • Can anyone name the first true single-chip microprocessor? It has to have integrated RAM, ROM and I/O.
    • I thought having integrated RAM, ROM, and I/O made it a microcontroller, not a microprocessor.

      • It would be both a microprocessor and a microcontroller. Microprocessor is the generic term; microcontroller is the more specialized term.

        The earliest microcontroller I've found a reference to is the TMS1000 (Texas Instruments), mid-1970s.

    • That's called a microcontroller, and it was probably the Intel 8051.
    • I can think of several candidates for the first single-chip microcontroller. The very first one in commercial production was probably the Texas Instruments TMS1000, although I think they may have had an even earlier version (TMS0970, perhaps?). These were four-bit microcontrollers.

      The first 8-bit single-chip microcontroller may have been the Intel 8048, introduced in 1976. It had masked ROM; there was also an EPROM version, the 8748, and a ROMless version (for external program memory), the 8035.

      Another possibility for the first 8-bit microcontroller may be the Mostek MK3870, which was a single-chip version of the Fairchild F8 processor family.

  • 4004 Memories (Score:4, Interesting)

    by Rick Richardson ( 87058 ) on Tuesday November 13, 2001 @01:19PM (#2559413) Homepage
    I remember stopping by the Intel booth at the National Computer Convention in New York in the 1971-1973 timeframe (can't remember the exact date).

    My Dad had put me on a train to New York to expand my teenage horizons. I returned with 4004 and 8008 data sheets and some chip samples. I spent the next few months dreaming up what I was going to do with the chips and drawing schematics.

    I never did build anything with them, because owning a terminal and a modem was more important to me at that time than having a uP - if I had had my priorities straight, I might be famous now [grin]. I did end up designing and building 3 different video terminals, though.

    Thanks for the memories.

    -Rick
    • "My Dad had put me on a train to New York to expand my teenage horizons. I returned with 4004 and 8008 data sheets and some chip
      samples."

      It sounds like he sent you there "to become a man," but you came back a geek instead :-)
  • I had an Odyssey video console for my old B&W TV back in the day. It had a membrane-style keyboard on which you could program assembly code, and it ran on the 4004.
  • by ch-chuck ( 9622 ) on Tuesday November 13, 2001 @01:22PM (#2559433) Homepage
    More about the 4004 development, right here [ieee.org] - they were Intel's customer at the time.
    • Best thing I've found on /. in a while.

      Of course, it is best balanced with Mr. Hoff's interview, as they seem to have different ideas on how much everyone contributed; the language and technical communication barriers were definitely there.

  • And I thought that... (Score:2, Interesting)

    by staili ( 200478 )
    ... the first microprocessor was older than UNIX [slashdot.org]

    It seems a bit strange to me to think that the first Unix didn't run on a machine with a microprocessor.
    • Lots of operating systems ran on non-microprocessor machines. In fact, the first machines running Unix were roughly as powerful as a 68000 processor, like that used in early Macs and many others (including my favourite Motorola-powered computer, the Sinclair QL).

      Motorola actually produced a "VAX-on-a-chip" for Digital that was basically a re-microcoded 68000.

    • In those days you couldn't put enough transistors on a chip to make a decent CPU core for a high-end system, so what people did was join a lot of bit-slices together. A bit-slice was a 1-, 2-, or 4-bit segment of an ALU with instruction decoders, which could be chained together through carry outputs and inputs to make a 16- or 32- or whatever-bit CPU core.
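      To make the chaining idea concrete, here is a small Python sketch. It is purely illustrative and not modeled on any particular bit-slice part (the function names and the 4-bit width are my own choices): each "slice" adds one nibble and hands its carry to the next slice, so four slices give you a 16-bit adder.

      # Toy bit-slice adder: chain 4-bit slices via carry to build a wider ALU.
      def slice_add(a4, b4, carry_in):
          """One 4-bit adder slice: returns (4-bit sum, carry out)."""
          total = (a4 & 0xF) + (b4 & 0xF) + carry_in
          return total & 0xF, total >> 4

      def wide_add(a, b, slices=4):
          """Chain `slices` 4-bit slices into a (4 * slices)-bit adder."""
          result, carry = 0, 0
          for i in range(slices):
              nibble, carry = slice_add((a >> (4 * i)) & 0xF,
                                        (b >> (4 * i)) & 0xF, carry)
              result |= nibble << (4 * i)
          return result, carry

      # Four slices make a 16-bit adder: 0x1234 + 0x0FFF = 0x2233, no carry out.
      total, carry_out = wide_add(0x1234, 0x0FFF)
      print(hex(total), carry_out)   # 0x2233 0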
  • I always prefer the latest model.
  • by SuzanneA ( 526699 ) on Tuesday November 13, 2001 @01:30PM (#2559477)
    Are we celebrating the wrong date?

    According to The Chronology of Personal Computers (1969-1971) [islandnet.com]:

    The first production run of the 4004 was in December 1970. Admittedly the production run had to be tossed due to mask errors, but the 2nd and 3rd production runs in Jan and Feb of 71 were more successful (the 2nd run still had errors). Sample calculator designs were shipped to Busicom in March 71 - comprising 4 4001s, 2 4002s, 2 4003s and 1 4004.

    The only relevance of November 71 that I can find is that the MCS-4 microcomputer based on the 400x series was released. But that's not the microprocessor itself.

    One thing that stands out is that Intel have had production problems and bugs since day 1 :)

  • Does anyone know if there are any computer museums with these on display? I haven't hit the computer museum in Boston for a while; I'd go to see this.
  • According to this site [blinkenlights.com], the first personal computer was Simon, c. 1950, a relay and paper-tape affair. You can argue with their definitions, but it has a lot of interesting historical machines.

    The MITS Altair [digitalcentury.com] really started the PC revolution, in that it was readily available, had a decent amount of compute power, and was affordable.
  • by Eric Smith ( 4379 ) on Tuesday November 13, 2001 @01:39PM (#2559529) Homepage Journal
    Actually, the CADC [microcomputerhistory.com] was the first microprocessor. It was used in the F-14A.

    It is less well known because the designer, Ray Holt, only received clearance to publish information about it in 1998.

  • 4004 Family Tree (Score:3, Interesting)

    by spiro_killglance ( 121572 ) on Tuesday November 13, 2001 @01:47PM (#2559586) Homepage
    I'm sure I'll miss a few, but here goes.

    Sorry the formatting is poor due to the lameness filter.

    4004
    4040
    8008 8080 Z80 (Zilog) Z8000 (16-bit)
    8086 8085 Z800 (Z80 extension)
    80186
    80286
    386SX also IA468 (still born new archi)
    386DX
    486SX
    486DX
    486DX-2
    486DX-4
    Pentium, AMD K5, 586 (cyrix)
    P-MMX P-PRO K6 686 Win chip
    P-2 Celeron K6-2 686MX Win Chip II
    P-III Cel(2) K6-3 ?
    Coppermine Athlon Cyrix III
    T-bird
    P4 Tualatin Athlon XP

    I've missed out the Xeons, and of course all the microprocessors that didn't have some lineage to the original 4004. Although the instruction sets changed a lot, particularly from the 4004 to the 8080 and from the 8080 to the 8086, there is enough similarity in their style and content to claim that your Pentium 4 or Athlon XP is directly descended from the 4004. It makes you wonder if Intel can really expect to shift people from the x86 arch to a totally new one.
  • by MsWillow ( 17812 ) on Tuesday November 13, 2001 @01:48PM (#2559599) Homepage Journal
    Still utterly unused, in anti-static foam, three Intel 4004s. My roommate decided to start collecting old CPUs, and I managed to find these, free. I still want to make a very simple blinking-lights toy with one of these, and proudly put the "Intel Inside" sticker on the box :)

    Goddess, this brings back memories! Hanging out at the library, using their terminal to call (at 300 baud, that was *fast!*) the HP-2000 system at Harper College, and chatting with friends who had serious money (Jeff actually *built* an Imsai 8080 unit, though he got many of the parts free by schmoozing the sales person).

    30 years, gads. Back then, having even a floppy disk was a wild dream; now we have 100+ gigabyte hard disks. Back then, having one whole K of RAM was heaven - last week, I bought 512 meg for $20. Back then, the clock oscillator could be made from a simple L-C circuit, and it ran at several hundred kilohertz. Now, it's a PLL-controlled internal oscillator, using an external crystal, all running at frequencies that make a microwave oven look slow.

    All this, in thirty years. That *really* makes me feel old :)
    • Yep... The data sheet crows about how the 4004 can handle what would now be called 4K of ROM and 640 bytes of RAM (though it was measured in bits back then), and it could add two 8-digit integers in just under a second (probably stored in 32 bits as binary-coded decimal).

      And I moan about how my secondary computer has 'only' 96meg of RAM in it....
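      A rough sketch of the binary-coded-decimal arithmetic mentioned above, in Python: eight decimal digits packed one per nibble (32 bits total), added one digit at a time with a decimal carry, which is roughly the kind of loop a 4-bit CPU has to grind through. This is illustrative only, not the 4004's actual routine.

      # Packed-BCD addition, one decimal digit (one nibble) per step.
      def bcd_add(a, b, digits=8):
          """Add two packed-BCD values (one decimal digit per nibble)."""
          result, carry = 0, 0
          for i in range(digits):
              da = (a >> (4 * i)) & 0xF
              db = (b >> (4 * i)) & 0xF
              s = da + db + carry
              carry, s = (1, s - 10) if s >= 10 else (0, s)
              result |= s << (4 * i)
          return result, carry

      # 12345678 + 87654321 = 99999999, packed in 32 bits with no carry out.
      print(hex(bcd_add(0x12345678, 0x87654321)[0]))   # 0x99999999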

    • And just think... Harper College still uses an HP-UX system that's not much more powerful than a 4004 :-P
  • The old days (Score:2, Interesting)

    by certsoft ( 442059 )
    I did some programming for the 4004 back around 1975; I had no idea it was that old a chip. Of course, I was working for the military, so that might explain it :)

    As I recall we had a Model-33 Teletype for software development. We punched the program into paper tape, called up a system using an acoustic modem and used their cross-assembler. Or maybe I'm just having an antacid flashback.

    • As I recall we had a Model-33 Teletype for software development. We punched the program into paper tape, called up a system using an acoustic modem and used their cross-assembler. Or maybe I'm just having an antacid flashback.
      No, I was doing the same thing during the same period. Disk storage was expensive and video display even more so. The KSR-33 Teletype provided cheap substitutes (paper tape and continuous-feed printer) for both. The damn thing was everywhere, and its design influenced the keyboards we still use. Oh yeah, and that's why Unix/Linux serial interfaces are still prefixed with /dev/tty.

      I worked for a company once that tried to do away with the ESC key -- a key which doesn't make sense except on a teletype. Everybody made fun of us for it.

      The KSR-33 was such an important part of computer culture that William Gibson put them in his classic cyberpunk novel, Neuromancer. Ironically, the Teletype Corp discontinued production of this model in 1984, the same year Neuromancer was published.

  • Wouldn't it be more appropriate to make a bigger deal when this thing is four thousand four years old? Or perhaps, when it turns 44?

    I've been waiting for the year 2015 to be the first poster with the story. I was really looking forward to all the extra karma gained with the mod ups.
    Dang, foiled again.
  • by compugeek007 ( 464717 ) on Tuesday November 13, 2001 @02:27PM (#2559738)
    Here is a link [udel.edu] that has a simple graph from the 4004 to the P7 (Merced Pentium II) that shows how Intel has obeyed Moore's law (at least until the P2).
  • by Darth RadaR ( 221648 ) on Tuesday November 13, 2001 @02:29PM (#2559751) Journal
    Intel 4004 Turns 30
    from the middle-aged dept.


    I'm 33 and that ain't middle-aged. I take offence! :P
    • Not to worry my son,

      According to my calculations, it's not 30. It's 29.99999... Did I mention I'm running a Pentium 75?

    • Wow, 33? I think we've just found the oldest person on Slashdot. I mean, 33 years ago. . . that's like, the sixties, right? How did kids ever grow up without the Internet?

      If we take Moore's law and extrapolate. . . 33 years is 396 months, so technology must be 2^(396/18) = 4,194,304 times more advanced now than it was then. Did you guys even have fire yet, or were you still confined to nice warm Africa?

      I think we need to call up Guinness (aside: isn't it strange that a record-tracking group also makes beer?) and update the records. This is a major archaeological find.

  • On speed. (Score:3, Interesting)

    by tit4tat ( 255420 ) on Tuesday November 13, 2001 @02:55PM (#2559912)
    Intel summed up the "speed" [sic] difference between a 4004 and a Pentium 4 with an interesting assume-this-basketball-represents-the-sun-like analogy: "Intel's first microprocessor, the 4004, ran at 108 kilohertz (108,000 hertz), compared to the Pentium® 4 processor's initial speed of 1.5 gigahertz (1.5 billion hertz). If automobile speed had increased similarly over the same period, you could now drive from San Francisco to New York in about 13 seconds."
    • by ch-chuck ( 9622 )
      If automobile speed had increased similarly over the same period, you could now drive from San Francisco to New York in about 13 seconds."

      but if you were running Winblows it would have crashed in Utah...
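    A quick back-of-the-envelope check of the arithmetic in the Intel quote above, in Python. The road distance and highway speed are my own rough assumptions (about 2,900 miles from San Francisco to New York at 65 mph), not Intel's figures, but the result lands in the same ballpark as the quoted 13 seconds.

    # Clock ratio between the 4004 and the launch Pentium 4, applied to a road trip.
    clock_ratio = 1.5e9 / 108e3          # ~13,889x
    drive_seconds = (2900 / 65) * 3600   # ~45 hours of driving, in seconds
    print(round(clock_ratio))            # 13889
    print(round(drive_seconds / clock_ratio, 1))   # ~11.6 seconds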
  • by Caractacus Potts ( 74726 ) on Tuesday November 13, 2001 @03:16PM (#2560007)
    There's a Dutch auction over at eBay with 100 of these little babies at $10 apiece. Just search for "Intel 4004". The date code on the picture shows that they were made in 1975, so they're not the ceramic and gold ones. Auction ends Thursday evening. Don't outbid me, or I'll mod you down.
  • by billcopc ( 196330 ) <vrillco@yahoo.com> on Tuesday November 13, 2001 @03:21PM (#2560035) Homepage
    But imagine a Beowulf cl...

    Hey, seriously, wouldn't it be groovalicious to have a bunch of 4004s produced using today's .13 micron technology, making them a tiny fraction of their original size? Throw down a hundred of them on a board and have it run a massively parallelized app of some sort at 25 cents per node.

    Why the hell not?
    • That's an interesting thought, and it's not the most ridiculous idea I've ever heard...

      The reason it makes sense is the same reason USB is in the process of displacing IEEE 1284 (and the reason RDRAM probably seemed like a good idea at the time) -- in theory parallel is faster, but the simple fact is that it's substantially more of a pain in the ass than serial to get working. The basic idea is to put several gazillion 4004 cores on one chip (perhaps something of a PGA-like device) and just bite off as much processor bandwidth as you need. I think the operative concept here is "dynamic pipelining" -- does that make sense to the sandbenders around here?

      /brian
  • You might want to read the interview with Ted Hoff
    This interview credits Hoff with inventing the microprocessor. I think crediting any single person with "inventing" this device is a silly oversimplification.

    But Hoff did accomplish something important -- a lot more important than inventing a particular gadget. He demonstrated that simple general purpose computers could be built that could replace a lot of the complex custom hardware that was then being built. In so doing, Hoff started us down the road to making computers ubiquitous.

  • hacking the 4004 (Score:4, Informative)

    by trb ( 8509 ) on Tuesday November 13, 2001 @11:25PM (#2561668)
    I hacked the 4004 when I was a student, back around 1977. My work-study mentor had a contract with Monsanto, working on the machines that made the very first production plastic Coke bottles. These bottles were heavy duty, like little dark green wiffle-ball bats. The bottles were taken off the market pretty quickly, because of some problem with the plastic they were made from. I still have one.

    Anyway, a conveyor belt dropped bottles from a wheel going around (a horizontal disc) onto straight rows of pins, also moving. It required some trigonometry and timing, especially when starting the machines up. It was controlled by a 4004; the code lived in 7 256-byte UV EPROM DIP chips.

    We had an assembler written in Fortran; it ran on either a Honeywell 1648 or a DEC PDP-10 (both notable machines in ARPANET/Internet history). When I got there, they used to type the hex assembler output into the PROM burner by hand! Burning the 7 PROMs took 18 hours of person time, and was error-prone. I wrote some code to do the EPROM download automatically, with a paper tape or something; it cut the process down to an hour and a half and made some folks pretty happy.

"Conversion, fastidious Goddess, loves blood better than brick, and feasts most subtly on the human will." -- Virginia Woolf, "Mrs. Dalloway"

Working...