
Microchips That Shook the World

Posted by Soulskill
from the seismo-eighty-six dept.
wjousts writes "IEEE Spectrum has an interesting article on '25 Microchips That Shook the World,' including such classics as the Signetics NE555 Timer, MOS Technology 6502 Microprocessor (Apple II, Commodore PET and the brain of Bender) and the Intel 8088 Microprocessor. Quoting: 'Among the many great chips that have emerged from fabs during the half-century reign of the integrated circuit, a small group stands out. Their designs proved so cutting-edge, so out of the box, so ahead of their time, that we are left groping for more technology clichés to describe them. Suffice it to say that they gave us the technology that made our brief, otherwise tedious existence in this universe worth living.'"
  • by Anonymous Coward on Friday May 01, 2009 @07:24PM (#27794049)

    PRINT ARTICLE (instead of the 5 separate pages): []

    The 25:
    1 - Signetics NE555 Timer (1971)
    2 - Texas Instruments TMC0281 Speech Synthesizer (1978)
    3 - MOS Technology 6502 Microprocessor (1975)
    4 - Texas Instruments TMS32010 Digital Signal Processor (1983)
    5 - Microchip Technology PIC 16C84 Microcontroller (1993)
    6 - Fairchild Semiconductor A741 Op-Amp (1968)
    7 - Intersil ICL8038 Waveform Generator (circa 1983*)
    8 - Western Digital WD1402A UART (1971)
    9 - Acorn Computers ARM1 Processor (1985)
    10 - Kodak KAF-1300 Image Sensor (1986)
    11 - IBM Deep Blue 2 Chess Chip (1997)
    12 - Transmeta Corp. Crusoe Processor (2000)
    13 - Texas Instruments Digital Micromirror Device (1987)
    14 - Intel 8088 Microprocessor (1979)
    15 - Micronas Semiconductor MAS3507 MP3 Decoder (1997)
    16 - Mostek MK4096 4-Kilobit DRAM (1973)
    17 - Xilinx XC2064 FPGA (1985)
    18 - Zilog Z80 Microprocessor (1976)
    19 - Sun Microsystems SPARC Processor (1987)
    20 - Tripath Technology TA2020 Audio Amplifier (1998)
    21 - Amati Communications Overture ADSL Chip Set (1994)
    22 - Motorola MC68000 Microprocessor (1979)
    23 - Chips & Technologies AT Chip Set (1985)
    24 - Computer Cowboys Sh-Boom Processor (1988)
    25 - Toshiba NAND Flash Memory (1989)

    ( mod me up so some karmawhore will find themselves FAIL'd )

    • Re: (Score:3, Insightful)

      by Jeff DeMaagd (2015)

      Five pages really isn't bad, though: there's a lot of reading per page, whereas a typical site might spend a page or more explaining why each chip was considered significant.

      Also, just listing the "winners" doesn't do justice to the article.

      • Re: (Score:3, Informative)

        by Quothz (683368)

        Also, just listing the "winners" doesn't do justice to the article.

        On the Internet, we have what are called "hyperlinks". They're typically colored differently from other text and underlined; when your mouse cursor passes over them, it will generally change. If, while your cursor is over the hyperlink, you press the left mouse button (called "clicking"), your browser will load a different page, to which the hyperlink points.

        A good example of this can be found in your parent's post, near the top. That poster included a colon after the first line of his or her post; this i

    • by Anonymous Coward on Friday May 01, 2009 @08:05PM (#27794337)

      Among the many great chips that have emerged from fabs during the half-century reign of the integrated circuit...Intel's 8088

      Wrong. The 8088 was a technical nightmare with a crappy architecture. It just got lucky. IBM's justifiable preference was Motorola's infinitely superior 68000. Unfortunately, the 68000 was 9 months to a year away from production and the 8088 was in production 'now'. IBM felt that it had to do it 'now' or miss the market window, so they (reluctantly) went with the 8088. A combination of perfect timing, luck, great marketing from IBM and Intel, and a superb marketing strategy from Intel (the best-selling sow's ear ever) sealed its place in history as a marketing success, but by no means a technical marvel.

      • by x2A (858210) on Friday May 01, 2009 @09:37PM (#27794931)

        "The 8088 was a technical nightmare with a crappy architecture . It just got lucky. IBM's justifiable preference was Motorola's infinitely superior 68000. Unfortunately, the 68000 was 9 months to a year away"

        Yeah, I hear ya, the architecture of a chip is much more important than whether it exists or not.

      • by Thomasje (709120) on Saturday May 02, 2009 @12:04AM (#27795709)

        The 8088 was a technical nightmare with a crappy architecture. It just got lucky. IBM's justifiable preference was Motorola's infinitely superior 68000. Unfortunately, the 68000 was 9 months to a year away from production and the 8088 was in production 'now'. IBM felt that it had to do it 'now' or miss the market window, so they (reluctantly) went with the 8088.

        The 8088 was a big step forward compared to the 8080, 8085, and Z80, which were the dominant CPUs for "personal computers" in the late '70s and early '80s. The 8088 could address one megabyte of memory without any external bank-switching hardware, had 16-bit registers throughout, and could run at higher clock speeds than the aforementioned 8-bit CPUs. Compared to the 64-kilobyte address space of the 8080/8085/Z80 and the 6502, this was a big improvement, and, as lame as it may sound today, a CPU with 16-bit registers and a 4.77 MHz clock was pretty fast compared to what existed in personal computers at the time.

        The 8088 really was a significant improvement. Yes, the 68000 was better, but it wasn't available in quantity yet; perhaps even more importantly, choosing the x86 for the PC meant that software like WordStar, dBase, and others, which was written in 8080 assembly language, could be ported to the new platform relatively easily. Porting 8080 code to the 68000 means rewriting everything; porting that same code to the x86 at least makes it possible to reuse some code -- because the x86 assembler can grok 8080 assembly language. Yes, you have to deal with the x86 segmented memory model, and with the differences between the CP/M system calls and those of MS-DOS, but those chores are still a lot less onerous than having to rewrite *everything*.

        Neither Intel nor Microsoft "got lucky" when IBM defined the PC architecture. Those were the technologies that made the most sense at the time.
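        (A quick illustration of the addressing scheme described above -- a minimal Python sketch; the example values are my own, not from the thread:)

            # 8086/8088 real-mode addressing: a 16-bit segment shifted left
            # 4 bits plus a 16-bit offset gives a 20-bit physical address,
            # i.e. 1 MB reachable with no external bank-switching hardware.
            def physical_address(segment, offset):
                return ((segment << 4) + offset) & 0xFFFFF  # wraps at 1 MB

            # Many segment:offset pairs alias the same physical byte:
            assert physical_address(0xB800, 0x0000) == physical_address(0xB000, 0x8000)
            print(hex(physical_address(0xFFFF, 0x000F)))  # 0xfffff, top of the 1 MB space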

    • by jo42 (227475) on Friday May 01, 2009 @09:26PM (#27794869) Homepage

      They completely missed the 74XX [] series of chips. So much stuff was built with them back in the day...

    • by Mr Z (6791) on Saturday May 02, 2009 @01:28AM (#27796179) Homepage Journal

      I'm not convinced. Some of these were just lucky, and rode the wave when the world shook, as opposed to shaking the world. The 555? Yes, truly sublime. The 741 op-amp? So fundamental, you couldn't imagine the world without it. But the 6502? A lucky near-clone of the 6800 that was popular not because it was particularly innovative, but because it was cheap. The 8088? The bastard stepchild of the 8086 which lucked out in getting picked over the 68000 in the IBM PC.

      Others are just interesting historical detours. Deep Blue and Transmeta Crusoe both were very interesting technologically, but they are in some sense interesting historical cul de sacs. The Explorer [] and related LISP machines [], Intel's iAPX432 [], and the INMOS Transputer [] also hang out in this neighborhood.

      DMD? Ok... that one always felt as if it was a project that succeeded only by application of the principle that with sufficient thrust, any pig will fly.

      Anyway... I guess any list like this is subjective.

      • "But the 6502? A lucky near-clone of the 6800 that was popular not because it was particularly innovative, but because it was cheap. The 8088? The bastard stepchild of the 8086 which lucked out in getting picked over the 68000 in the IBM PC."

        The article is entitled "25 Microchips That Shook the World". The criterion is impact: chips that were influential. That doesn't necessarily mean they had clean or clever designs, or were particularly innovative, or even "good" by any objective measure. It means that they mattered in the course of the industry.

        You dismiss the 6502 because its only innovation was low cost. That still counts, and arguably more than most other distinctions. The Ford Model T, the Apple II, the IBM-PC clones, even boo

    • by labnet (457441)

      Great list, but I would add a few old chips:

      78 series voltage regs (e.g. the LM7805). All that logic needed sweet regulated power supplies, and the 78 series of regulators is still a great choice for a cheap linear reg.
      4000 series logic. What made these chips great was that you could operate them from 3 V to 15 V, making them a great choice for hobbyist battery-powered projects.
      8051: Intel's first mass-market microcontroller (also made by Philips/NXP).
      LM324: my old preferred cheap op-amp over the 741, as it was single-supply and lower curren

    • 12 - Transmeta Corp. Crusoe Processor (2000)

      Yah OK technically not the 21st century. :)

      But that's it? We haven't managed to come up with some new weirdo bizarro way of adding numbers since 2000?

      29 years, 25 chips. 9 years, nothing.

      Are we at our limit? Or has comp sci evolved to some new level where the instruction set doesn't matter?

      • Re: (Score:3, Insightful)

        Nah, you no longer need to be extremely innovative to produce awesome chips. These are the days of cheap transistors at nm manufacturing scales. We have gotten to the point where a clever way of doing something is obsolete, because it's faster and cheaper to throw more transistors at our designs and reuse all our existing "cleverness".

        There is something about limited resources that encourages amazing innovation. When we have "enough", why innovate?

    • What? Where is SID?
  • All of them great (Score:5, Insightful)

    by Kell Bengal (711123) on Friday May 01, 2009 @07:24PM (#27794055)
    Even as a modern EE/robotics guy I use some of those parts today (555 timers in particular). I can't imagine the pain you'd have to go through to do some of the things they were used for in their heyday with discrete transistors and passive components.
    • Re:All of them great (Score:5, Interesting)

      by kbob88 (951258) on Friday May 01, 2009 @07:41PM (#27794169)

      I'm continually amazed at all the stuff people get the 555 to do. Just google '555 circuit', and be prepared for some major geek accomplishments.

      • Re: (Score:3, Interesting)

        by noidentity (188756)
        The chip gives you a set of building blocks [], so there is great flexibility in how you can combine them. There's probably some similarity to good software API design here, where you provide orthogonal features that the user can combine however he likes, allowing a small API to provide lots of functionality.
    • by frieko (855745) on Friday May 01, 2009 @08:02PM (#27794293)
      As a young whippersnapper I imagine the pain of reading the 555 datasheet whenever I flash a timer to an 8-pin microcontroller ;)
      • Re:All of them great (Score:5, Informative)

        by NoMaster (142776) on Friday May 01, 2009 @08:16PM (#27794411) Homepage Journal

        As an old fart, I wonder why you'd rather use a microcontroller, with all the attendant pickiness over I/O and supply voltage stability and noise, costing > $1 in bulk, over a 555 that'll work in fairly noisy conditions from 5 to 15 V and costs a few cents in bulk.

        Horses for courses; just try getting your microcontroller to do something like flash an LED in a car without all the extra supply regulation and filtering. A 555 will do it with 6 additional components, including the LED, for less than $1 ;-)

        • Re:All of them great (Score:4, Interesting)

          by NoMaster (142776) on Friday May 01, 2009 @08:34PM (#27794531) Homepage Journal

          (Actually, you've just inspired me. Someday I'm gonna build a calculator, based on an 8-pin micro, to display the optimum R1, R2, & C for a given frequency on an LCD screen.

          I might even throw in calculation of values for monostable and bistable mode ;-)
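          (For what it's worth, the guts of that calculator are just the standard NE555 astable formulas; a minimal Python sketch, with component values that are my own picks:)

              from math import log

              LN2 = log(2)  # the cap charges and discharges between 1/3 and 2/3 Vcc

              def astable_frequency(r1, r2, c):
                  """NE555 astable output frequency in Hz (r1, r2 in ohms, c in farads)."""
                  return 1.0 / (LN2 * c * (r1 + 2.0 * r2))

              def duty_cycle(r1, r2):
                  """Fraction of each period the output spends high."""
                  return (r1 + r2) / (r1 + 2.0 * r2)

              # ~1 Hz LED flasher: R1 = 10k, R2 = 100k, C = 6.8 uF
              print(astable_frequency(10e3, 100e3, 6.8e-6))  # ~1.0 Hz
              print(duty_cycle(10e3, 100e3))                 # ~0.52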

        • by frieko (855745)
          PIC10F200, 41 cents
          5 volt regulator, 16 cents
          Vdd capacitor

          To each his own; I was really just looking for an excuse to use the word 'whippersnapper'. (And as a coincidence I do have a uC in my car, it fools my crappy tape deck into thinking my iPod is the factory trunk-mount cd changer.)
          • Re:All of them great (Score:5, Interesting)

            by NoMaster (142776) on Friday May 01, 2009 @09:14PM (#27794799) Homepage Journal

            Sorry, should have said AU$ ;-)

            A PIC 10F200 costs AU$1.24 in 25+ quantities, compared to an NE555 at AU$0.429 apiece, AU$0.351 for 10+, or AU$0.26 for 250+.

            And yeah, I was just poking fun at whippersnappers who automatically put a micro into everything. Don't forget to amortise the cost of your programmer hardware & coding time ;-)

            You also forgot the Vcc cap - don't worry, so did I with my mental zener-based supply. Don't want your regulator latching up or self-destructing on +- supply spikes, do you? ;-)

            (Aside: I once built a set of Knight Rider lights for my car (OK, OK - feel free to poke fun at me for that but, in my defence, it was the '80s ;-) based on a 555, a BCD up/down counter, and a BCD-decimal decoder. I didn't filter the supply well enough, but that had the advantage that when it started working erratically, by skipping lights or suddenly reversing direction, I knew it was time to change the distributor points ;-)

            • by Temkin (112574)

              I didn't filter the supply well enough, but that had the advantage that when it started working erratically, by skipping lights or suddenly reversing direction, I knew it was time to change the distributor points ;-)

              I'll wager a 555 in an 8-pin TO can that half of Slashdot knows nothing of distributors or points. ;-)

            • by Mr Z (6791)

              I started building such a thing myself, but I used a 4 bit binary counter, 74138 decoder and some XOR gates. The 4 bit counter always up-counted, and you just XOR the 4th bit with the other three to get the back/forth oscillation. The whole thing drove some super-bright LEDs. And, of course, a 555 provided the clock.

              I never finished them, though I do still have the parts somewhere. I breadboarded the digital section and verified it, and soldered together a few LEDs worth of the analog portion.

              Yes, I actu
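              (The XOR trick is easy to check in software -- a small Python sketch of the logic, not the original hardware:)

                  # A free-running 4-bit up-counter whose top bit is XORed into the
                  # low three bits: the decoded value sweeps 0..7 and back, so one
                  # always-up counter scans the LEDs in both directions.
                  def scan_position(count):
                      count &= 0xF                  # 4-bit counter
                      direction = (count >> 3) & 1  # bit 3 picks the sweep direction
                      return (count & 0x7) ^ (0x7 * direction)

                  print([scan_position(n) for n in range(16)])
                  # [0, 1, 2, 3, 4, 5, 6, 7, 7, 6, 5, 4, 3, 2, 1, 0]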

          • Re: (Score:3, Interesting)

            by Linker3000 (626634)


            LM3909 Chip (sadly discontinued)
            2 x Resistors for Vcc > 6v

            • Can't believe some third party didn't take over production of these excellent chips.

              They are still available (new old stock) but prices are rising steadily.
        • Re: (Score:2, Informative)

          by evanbd (210358)

          The PIC does it with three external components -- a regulator and a capacitor for power, and a resistor to help drive the LED. If you run at lower supply voltages you can omit the resistor and use the output impedance of the PIC instead, provided you don't care about tweaking the power consumption. Lower parts count and less board area is cheaper, and the PIC is only marginally more expensive than the 555.

          And not only is the PIC cheaper, it can do a better job for most circuits. It will operate a more ac

          • Re: (Score:3, Informative)

            by mako1138 (837520)

            The PIC requires some infrastructure, though: compiler/assembler, programmer. The 555 requires no external programming.

            That said, it's amazing what you can do with a dirt-cheap microcontroller these days.

        • Re: (Score:2, Insightful)

          by servodave (812645)
          TRUTH! I've used both AVRs and 555s extensively - hundreds of circuits with each over the years. Micros have their place, but they are too picky about too many things. The 555 is bulletproof, and listing it as #1 is very appropriate. All hail the LM555.
        • Re: (Score:3, Interesting)

          by mikael (484)

          My hardware engineering professor once told us this story.

          One time the Air Force was looking for a visual system to detect airmen who had parachuted into the ocean. The requirements were that the visual system should have a 180° field of view in order to detect a single point of orange at a distance of several miles, be able to work within a fixed temperature range, require a minimum of maintenance, and be vibration resistant. Two solutions were proposed.

          The first system was a real-time video system with m

          • Re: (Score:2, Insightful)

            by aynoknman (1071612)
            Yeah but cleaning the bottom of the cage under the real-time video system with multiple processors and cameras is easier.
        • by Dan East (318230)

          Because a true geek would not just want the LED to flash on and off symmetrically, but instead flash their name in Morse code. Try doing that with a 555. ;)
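          (And the timing loop really is only a few lines on a micro. A sketch in Python: set_led is a hypothetical stand-in for a GPIO write, and the Morse table is abbreviated to the letters the demo uses:)

              import time

              MORSE = {'d': '-..', 'a': '.-', 'n': '-.'}  # abbreviated table for the demo
              UNIT = 0.2  # seconds per Morse time unit

              def set_led(on):
                  print('#' if on else '.', end='', flush=True)  # stand-in for a GPIO write

              def blink_name(name):
                  for letter in name.lower():
                      for symbol in MORSE[letter]:
                          set_led(True)
                          time.sleep(UNIT * (1 if symbol == '.' else 3))  # dot = 1, dash = 3
                          set_led(False)
                          time.sleep(UNIT)            # one unit between elements
                      time.sleep(UNIT * 2)            # three units total between letters

              blink_name('dan')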

      • Re: (Score:3, Insightful)

        by evanbd (210358)

        The operational basics of the 555 are completely explained by a half-page functional block diagram. You can easily fit all the important max ratings and speeds and such on the other half of the page. Even the 10F200 has a 96 page data sheet (though to be fair, to be that thorough about the 555 would probably require 2 or 3 pages, not just one).

        The PIC has a lot going for it when compared to a 555, but simplicity is not one of those things.

  • by DudeFromMars (1097893) on Friday May 01, 2009 @07:25PM (#27794061)
    If the 6502 was good enough for Bender, why did they bother with anything else?
  • by toejam13 (958243) on Friday May 01, 2009 @08:05PM (#27794325)

    Interestingly enough, when Bill Mensch and company designed the 6501 (and the later, lawsuit-modified 6502), they purposely made it very easy to expand for future use. Although the chip was originally designed for use in embedded solutions, several reports suggest that Bill Mensch, as well as fellow designer Chuck Peddle, saw the possibilities of the personal computer. This was around the time that the Altair 8800 had just been released.

    Bill Mensch tried to push Commodore toward features that might be useful for a personal computer. However, Commodore management rebuffed him. Supposedly frustrated that Commodore management was as short-sighted as the Motorola management he had fled just a few years earlier, Bill Mensch went on to start his own company designing successors to the 6502.

    Over at the Western Design Center, Mensch and his sister designed the WDC 65C02, a bug-fixed and enhanced version of the MOS 6502 that found its way into the Apple IIc and "enhanced" IIe. They also designed the WDC 65816, a greatly enhanced version of the 65C02 that included 16-bit index registers, 24-bit addressing, movable stack and zero page locations, and a host of new ops that allowed for jump tables and position-independent code (useful with multitasking OSes and shared libraries).

    Just imagine if Commodore had the 65816 in 1980 and released a 16-bit successor to the PET that could handle up to 16MB without the weirdness of bank swapping or segmentation. It would have been very popular with programmers. Smoking the "what if" crack pipe even more, imagine if they ported TRIPOS to the 65816. :)

    Too bad they probably would have ruined it by bundling it with a chiclet keyboard.

    • by PhantomHarlock (189617) on Friday May 01, 2009 @08:24PM (#27794461)

      There's a pretty good write-up of those days at MOS in the Rise and Fall of Commodore book that was reviewed here on Slashdot some time ago.

      I'm glad the 6502 made the list, along with the 68000 that the Amiga used so well along with Paula, Agnus, Denise, etc., and its successors the 68020, 040, etc. The 8088 of course, and the 555, still in use today as others have mentioned. SPARC was pretty big in its day. Z80. ARM1. Those are the ones that stick out in my head the most.

      And yea the Crusoe, I dunno about that.

      It's amazing how most of these names are not much more than a word or phrase in the eyes of most people born in the 1990s or late 1980s. To us older chickens they were almost breathing, anthropomorphic beings, because of how tightly you could weave assembly code around them and take advantage of their physical properties, bugs and nuances to perform hacks. When computers stopped being quaint hobby machines, they lost their soul. Early steam engines were similar, with highly polished brass, brightwork and Victorian scrollwork, imbued with the personality of their creators. When the railroads got real big, they became commodities, were painted black, and weren't assigned a crew for life, so there was no pride of ownership. Now we are in the era of the Dell box... I don't build my own machines anymore because it doesn't make any financial sense.

      Good times to remember.

    • Re: (Score:2, Interesting)

      by Tablizer (95088)

      Just imagine if Commodore had the 65816 in 1980 and released a 16-bit successor to the PET that could...

      According to a quote from Chuck Peddle in "Rise and Fall of Commodore", he didn't see much of a demand or need for 16-bit processors. Commodore kicked around the idea of a 16-bit chip, but there didn't seem to be much enthusiasm for it, so it languished. What they saw was pressure for more peripheral features for less cost: cheaper floppies, hard drives, printers, color monitors, etc. And companies were findi

    • by metamatic (202216)

      Just imagine if Commodore had the 65816 in 1980 and released a 16-bit successor to the PET that could handle up to 16MB without the weirdness of bank swapping or segmentation.

      You mean like the Apple IIgs []?

    • I'm just going to randomly suggest everybody take a look at "The 386 Business Case" from the Computer History Museum, hosted on YouTube. Good stuff :)

    • Bill Mensch went on to start his own company designing successors to the 6502.

      Cyberdyne Systems?

  • Crusoe was a failure (Score:5, Interesting)

    by YesIAmAScript (886271) on Friday May 01, 2009 @08:08PM (#27794351)

    It was nothing special at all, and it definitely didn't shake the world. It didn't lead to a bunch of devices using it, and it didn't lead to a new path for computing.

    The presence of this chip on here makes no sense to me.

    Oh wait, I just got to where they talk about a Micronas MP3 decoding chip. So I guess this list is a little hit or miss.

    I could hardly agree more with the Chips & Technologies AT chipset being on this list. It may have been more important to the success of the 8088 than the 8088 itself was. All of a sudden making a PC clone was easy, and inevitably it became the standard, so standard that now even Macs use the PC architecture.

  • Motorola 68k (Score:5, Insightful)

    by newcastlejon (1483695) on Friday May 01, 2009 @08:27PM (#27794483)
    Seriously! How many of us learned assembly with a 68k? How many are in service today? It's like the Mini/Beetle/Model T of the chip world: cheap, simple and with a practically cosmopolitan distribution.
    • by Slur (61510) on Friday May 01, 2009 @09:24PM (#27794857) Homepage Journal

      Yep, I learned my first Assembly Language on the 6502 back in 1983 or so, and had just started writing cool, fast game and utility software on the Atari 800 around 1985 using the very nice Atari Macro Assembler, when *boom* the era of Atari was over.

      So I moved to the Amiga and programmed that lovely machine in 680x0 assembler using the slick "DevPac" programming environment by HiSoft. Bad geek that I was, I never learned Intuition or any of the Amiga system calls, but went straight to the hardware for the titles I worked on, namely "Dino Wars" and "Bill 'n' Ted's Excellent Adventure" (apologies for both). Then *boom* the Amiga was dead.

      After a long hiatus from programming I got a PowerMac. On the Mac the first software I bought was the fringe macro assembler "Fantasm" by Lightsoft, thinking I'd be a Mac Assembler guru, but alas, Apple had moved from 680x0 to the PowerPC by that time, and only insane maniacs program that chip directly in Assembler.

      So, in 1995, I finally learned C, and a few years later C++.

      Of course nowadays I learn a new programming language every year and an entirely new framework or API every couple of months.

  • by Man On Pink Corner (1089867) on Friday May 01, 2009 @08:29PM (#27794501)

    Too awkward to compose a URL at the moment, but if you're a pro or more advanced hobbyist you should google the 555 chip's designer, Hans Camenzind. He released a nifty book on basic analog IC design that never got the attention it deserved, IMHO. I believe it's downloadable as a PDF from his site.

  • 8088 - Gakk! (Score:5, Interesting)

    by swm (171547) * on Friday May 01, 2009 @08:31PM (#27794517) Homepage

    The 8088 is a twisted, flawed architecture.

    In true QWERTY fashion, it got a lock on the market by solving an immediate problem: the need to get beyond a 16-bit address space in a single-chip microprocessor. We are hamstrung by its limitations to this day.


    Limitations of the IBM PC Architecture

            The Curse of Segments []

    • Re:8088 - Gakk! (Score:5, Informative)

      by Thomasje (709120) on Friday May 01, 2009 @09:14PM (#27794797)
      What rock have you been living under? The linked rant/article is from 1992! Contrary to what it says, the limitations of the 8088 architecture *were* overcome by the 386, but that article was written before DOS extenders allowing protected-mode applications became common, never mind before Windows added protected-mode support. The Windows world has had a flat address space for many years now, and the segmented aspects of x86 are only supported for non-performance-critical legacy code.
    • Actually, one could make the argument that we do not have enough segments. Were there more segments available within an application, you could have theoretically eliminated some sorts of attacks caused by buffer overruns.

      Looking back at the time, going from segments to flat was a godsend. However, going from segments to selectors would probably have been better from a security standpoint, although computers would be slower.

      • Re: (Score:3, Interesting)

        by TheRaven64 (641858)

        Mod parent right up. With the 8086, segments were actually useful: they were just (effectively) start addresses in memory, and you had direct access to the next 64KB with 16-bit pointers. The 386 kept segments but made them useless. You get 8192 global ones and 8192 local (per-process) ones. This is enough for some tricks, for example putting the stack and heap in separate segments so that they can grow independently, but not enough for anything really fun. With a bigger LDT you could put every object

        • by tjstork (137384)

          Just imagine a world where a PC's hardware had the same range checking and memory security as what is done in Java and C# today. That would completely kick ass.

          With the amount of memory that is out there today, you could have really giant LDTs...

  • The article is "Microchips That Shook the World", not "Great microchips"

    The 80386 along with the C&T chipset ignited the IBM PC clone industry.

    • by NoMaster (142776)

      The C&T chipset certainly gave it a bump-along, but there were IBM PC clones around well before the 80386. Sales of original 8086/8088 genuine IBM PCs were dwarfed by sales of 8086/8088/V20-based clones, well before the 80386 was even developed.

  • OK, not a chip but a chip family - but surely one that, perhaps even more than the 7400 series, influenced an entire generation of engineers and circuit designs. It really was the first major series that allowed you to pretty much bolt together designs, lego-fashion, from building blocks without really worrying about interfacing too much. In comparison the 7400 series was much fussier with limited fanout and fan-in, and a fixed 5v supply. CMOS was BASIC to the 7400's COBOL.
  • Like the ix432 (check my nick... you will understand), AMD 29xx. Video chipsets too, like the ET4000, which brought 'accelerated VGA' to the masses. EPROMs...

    That is the problem with any list: it's YOUR list.

    But I agree with most of it.

    • Yup, it's a pretty good list but a little short on details. The ARM1 only merits a single paragraph, which is a shame because the story behind it is one filled with some typical British arrogance which turned out to be justified. They looked at the RISC designs coming out of UCB and decided that, if a bunch of American students could design a CPU, so could a handful of British engineers.

      They tangentially mention the Newton, but miss the part this played in the existence of ARM as it is today. Apple wa

  • Zilog Z80 (Score:2, Interesting)

    by TW Burger (646637)
    I wrote an operating system and hardware drivers for a Z80 based embedded system in 1986. It was and still is a great processor as long as you only need 8 bits.
  • For example, the video chips that launched a revolution. From SGI's original graphics accelerators through the Amiga's "Fat Agnus" to the early nVidia and ATI cards.

    But I do admit I like the fact they included the 555 and 741. Such fond memories breadboarding with those things...

  • by toejam13 (958243) on Friday May 01, 2009 @09:20PM (#27794835)

    For as groundbreaking as the ARM processor series was, it was beaten to the punch by DARPA. Not only did they help give us the Internet, they also helped with the evolution of the chips that power the PDAs and smartphones that use the Internet.

    Now for a trip back in time... supposedly during the late 1970s, processor design was starting to hit the limits of manual methods. These were still the days of designing a microprocessor on paper. The military, a huge consumer of microchips at the time, decided to sponsor research into the creation of standardized processes for microprocessor design. The result was DARPA's VLSI Project. Stanford, UNC/Chapel Hill, Berkeley and others were involved.

    Numerous products and organizations came out of the VLSI Project. The BSD fork of AT&T's UNIX saw major use and evolution. Networked CAD systems matured, specifically using the Stanford University Network (SUN) workstation, which was commercialized by Sun Microsystems.

    Most relevant to the article, though, was the advancement of the "RISC" design. During the 1970s, researchers noted that highly orthogonal processors (where every type of operation, such as ADD, SUB, SHIFT, XOR, etc., can be used with any kind of addressing mode, such as direct, indirect, indexed, etc.) were somewhat wasteful. The vast majority of those combinations were rarely used. If you restricted most operations to register-only operands, you could really simplify the processor.

    RISC architectures are less memory-efficient than CISC architectures, something that was important in the 1970s, a time when dinosaurs and 4KB Altairs roamed the earth. They are also more tedious to program in assembly language, also an issue during the 1970s, when higher-level language compilers were rather unoptimized. However, by the time the VLSI Project came around, these limitations were going away.

    Since RISC processors are so much easier to design than CISC processors, researchers used their groovy new tools to design one. So in 1982, the DARPA RISC-I was born, with less than half as many transistors as the Motorola 68000. It also ran circles around the 68000. A year later, the RISC-II was released. It was three times as fast as its predecessor.

    The RISC design was also a huge advancement for researchers over at Stanford. John Hennessy was trying to design a new processor that exploited the concept of pipelining. The problem, however, is that CISC instructions have variable (and often long) execution times, which can cause the pipeline to stall as the processor runs dry of work to execute. The RISC design solves that problem because most of the operations, with the exception of memory load/store ops, are short and quick. Hennessy borrowed these "new" concepts and came up with the MIPS architecture, one of the first popular RISC designs.

    Not much later, Acorn Computer, looking to replace the MOS 6502 processor but dissatisfied with the Motorola 68000, National Semiconductor 32016 and others, went looking for a new chip in 1983. They traveled to the States and visited Western Design Center. Seeing how "simple" it was to design a processor, they brainstormed up the concept of the ARM1.

    The ARM probably would never have been designed without the advances that came out of the VLSI Project. The ARM2, the first production unit, only contained some 30K transistors. The DARPA RISC-I was 44K while the RISC-II was reduced to 40K. The 68000 was a whopping 70K transistors.
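    (A back-of-the-envelope model of the pipelining point above -- the pipeline depth and instruction latencies here are made up for illustration:)

        # Toy pipeline model: a depth-stage pipeline retires N instructions in
        # depth + N - 1 cycles when every instruction spends one cycle in
        # execute; each extra execute cycle stalls everything behind it.
        def pipeline_cycles(latencies, depth=5):
            return depth + len(latencies) - 1 + sum(l - 1 for l in latencies)

        risc = [1] * 100              # uniform single-cycle register ops
        cisc = [1, 4, 1, 6, 1] * 20   # mix with multi-cycle memory-operand ops
        print(pipeline_cycles(risc))  # 104 cycles for 100 instructions
        print(pipeline_cycles(cisc))  # 264 cycles for the same instruction count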

  • by BikeHelmet (1437881) on Friday May 01, 2009 @10:24PM (#27795191) Journal

    I'd rather read about interesting and unique chips than ones that "shook the world".

    Like the Propeller [], with its interesting interrupt handling, and non-stamped [] design.

    • Re: (Score:3, Interesting)

      by BikeHelmet (1437881)

      Damnit, I clicked Submit rather than Continue Editing!

      This isn't so much impressive hardware, as impressive software: []

      FMV on an 8088!

      Okay, I admit, the quality/resolution isn't that good, but it's still fascinating. :P

    • by evanbd (210358)

      The Propeller looks really interesting. I might get one to play with, but I'm disappointed by the lack of Linux support.

      I'm also surprised they don't have *any* hardware peripherals on-chip. I'm used to working with the PIC microcontrollers, which give me tons of things like UARTs, USB interface, SPI controller, CAN bus, A/D converters, timers, PWM output, comparators, etc. Obviously some of that can be implemented with one of the cogs, but some of it would be hard or impossible. The lack of hardware mu

      • I don't know if it's fast enough for Linux. It really is a microcontroller - not a full-fledged CPU or SoC. It only has something like 32KB of RAM built in. You might get one of those DOS-style assembly operating systems to run on it, but probably not Linux.

        I'd rate it somewhere up there with all those AVR CPUs used in projects like the Arduino [] - except with different capabilities.

        Their website has a lot of downloadable code on it. Do some research and see if anyone has written code for what you need. :)

        • by evanbd (210358)

          I meant Linux development support -- it would need a lot more memory and an MMU to get Linux running on it.

          As a 32-bit device at 20 MIPS per core with 8 cores, it has significantly more performance than the 8-bit AVRs (40 MIPS max, I believe) and PICs (12 MIPS) in most applications. The PICs will outperform it if you need a lot of multiplies -- hardware multiply of 8 by 8 to a 16-bit result in a single cycle; that can do a 16x16 to 32 in about 2 us, IIRC. A single Propeller core looks like it would be a bit s
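          (For reference, composing the wide multiply out of an 8x8 primitive is the usual schoolbook decomposition -- a Python sketch of the arithmetic, not PIC code:)

              # 16x16 -> 32 built from an 8x8 -> 16 multiply:
              # (ah*256 + al) * (bh*256 + bl)
              #   = ah*bh*65536 + (ah*bl + al*bh)*256 + al*bl
              def mul16(a, b):
                  ah, al = a >> 8, a & 0xFF
                  bh, bl = b >> 8, b & 0xFF
                  return ((ah * bh) << 16) + ((ah * bl + al * bh) << 8) + (al * bl)

              assert mul16(0xBEEF, 0xCAFE) == 0xBEEF * 0xCAFE
              # Four 8x8 multiplies plus shifts and adds -- hence microseconds
              # rather than a single cycle.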

    • The interrupt-handling design on the Propeller isn't particularly novel, unless you restrict yourself to PC hardware. Mainframes and supercomputers often handle interrupts like this, on a separate, dedicated interrupt controller. A typical IBM supercomputer, for example, has one or two PowerPC 4xx series chips for handling I/O without the need to interrupt the main CPU. The Wikipedia article is also slightly misleading when it talks about saving state and then restoring it to handle an interrupt. A lo
  • While it's relatively new compared to everything on this list, the AMD Opteron, which came out in 2003, will be the face of computing for the foreseeable future. Even now in 2009, AMD's archrival Intel is just coming out with integrated memory controllers and high-speed serial direct interconnects. The Opteron also forced Intel to give everyone 64-bit memory addressing in x86 (which Intel wanted to keep in the realm of high-end RISC/Itanium machines).

    Opteron wasn't the first chip to have any of these featur

  • They could not include every processor, of course, but this was a nice piece of hardware at the time. The Heathkit microprocessor trainers used it (I programmed it to play Anchors Aweigh as extra credit for a retired Navy prof). It had accumulators A and B, an index register and indexed addressing (the first uP to do so?), 16-bit regs, a flat memory space, and memory-mapped I/O. It preceded the later 6502, which had a similar programming model. It was clean and fun to learn; the Intel architecture has always been foreign to me; was there

  • I would make the argument that if you were going to pick an Intel CPU that "shook the world", it would be the 80386 more than the 8088. Dubbed "the mainframe on a chip", it more or less lived up to its hype. Following in the wake of the 80386 came Linux and Windows NT... essentially server operating systems running on a desktop.

    The 8088, conversely, was just another personal computer chip. It had some advantages but didn't really change the sorts of operating systems you could make with it. Atari, Appl

    • by GrahamCox (741991)
      Nah. The 386 was only Intel getting its act together - finally - after the debacle of its earlier efforts, waking up to the far cleaner design that was the 68000 family. The 68000 was truly a "mainframe on a chip"; it was designed precisely to look like the architecture of a PDP-11, and its design began in 1976. Intel took nearly twenty years to catch up.

      Atari, Mac, Amiga? They all used the 68000 family, because you could write a GUI-based OS to run on it. By contrast, the Intel chip held back
      • by tjstork (137384)

        If IBM had chosen the 68000 for its PC project, things may have turned out very, very differently.

        Yeah, it would have and it could have. But then, it would have undermined what could have been achieved with the Amiga, ST and Mac....

        The reason I gave the 386 the plug was that I felt the memory management and hardware paging were more central to having a multiuser OS than more registers... The Amiga was cool, but I've had enough GURU MEDITATION ERRORS for one lifetime. The system was just not stable.

      • The 68000 didn't have virtual memory so can't be compared with the (later) 80386. The 68010 (which I programmed on in the 80s in a Torch XXX) had virtual memory and was the 'mainframe on a chip'. Also I question this idea of IBM considering the 68000 for the IBM PC. The 68k had a 16-bit data bus so would have meant a more expensive mainboard design than the (8-bit bus) 8088. The Motorola chip to compare with the 8088 is the 68008 (Sinclair QL anybody?).
  • PowerPC (Score:3, Insightful)

    by GrahamCox (741991) on Saturday May 02, 2009 @02:21AM (#27796407) Homepage
    The PowerPC should really be there. Not so much for its use in the Mac, but because it's so widespread in the embedded world. In fact, I think it's the most used embedded architecture by far. You might not think of your car or washing machine as "world-changing", at least not for their electronics, but actually the reliability of modern devices is largely down to this. The PPC must be one of the most common "invisible" bits of technology that most people actually use.
  • It's "outside the box", not "out of the box".

    "Out of the box" is pretty much synonymous with "off the shelf."
  • In 1997, a company in Sunnyvale named Silicon Spice created an amazing device. It had over 80,000,000 transistors, and it replaced one or more huge (over 4 feet square) motherboards manufactured by AT&T. The chip had a hundred simple cores, and massive amounts of peripheral network management and computing power.

    It was called Callisto, and it was bought by Broadcom. Three genius, genius engineers from MIT created it, and it allowed high performance signal processors to be implemented in software. In fact it a
