
Review of IBM's Original Personal Computer 154

Posted by Soulskill
from the won't-run-crysis dept.
illiteratehack was one of several readers to point out that today is the 30th anniversary of the introduction of IBM's first popular PC, writing, "V3 managed to dig up the original review of IBM's Personal Computer Model 5150, the machine that popularized personal computing. There are some great comments; the article's author wasn't sure if IBM would sell the PC outside the US, and he mentions the inclusion of a 'very high quality 11.5-inch' display. The article also shows that while the PC may have changed a lot on the inside, the way it was reviewed hasn't changed much in 30 years." Other readers sent in reflections on 30 years of the PC by various tech icons and a speculative look at what the computing industry would have looked like without IBM.
This discussion has been archived. No new comments can be posted.

Review of IBM's Original Personal Computer

Comments Filter:
  • Scroll Lock! (Score:5, Insightful)

    by Sebastopol (189276) on Friday August 12, 2011 @12:24PM (#37069632) Homepage

    FTA:

    "However, a mysterious key called Scroll Lock doesn't actually do anything."

    30 years ago... as useless then as it is now.

    • Re:Scroll Lock! (Score:4, Informative)

      by conares (1045290) on Friday August 12, 2011 @12:32PM (#37069724)
      From Wikipedia:

      The Scroll Lock key was meant to lock all scrolling techniques, and is a remnant from the original IBM PC keyboard, though it is not used by most modern-day software. In the original design, Scroll Lock was intended to modify the behavior of the arrow keys. When the Scroll Lock mode was on, the arrow keys would scroll the contents of a text window instead of moving the cursor. In this usage, Scroll Lock is a toggling lock key like Num Lock or Caps Lock, which have a state that persists after the key is released.
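      The toggling behavior described in that excerpt can be sketched in a few lines of Python. This is a hypothetical illustration only; the `TextWindow` class and its method names are invented for the sketch and don't correspond to any real application's API:

```python
# Hypothetical sketch of Scroll Lock-aware arrow-key handling, as the
# excerpt describes: with the lock on, the arrow keys scroll the
# viewport; with it off, they move the cursor. All names are invented.

class TextWindow:
    def __init__(self, num_lines, height):
        self.num_lines = num_lines   # total lines in the text buffer
        self.height = height         # lines visible at once
        self.cursor = 0              # current cursor line
        self.top = 0                 # first visible line of the viewport
        self.scroll_lock = False     # the persistent toggle state

    def press_scroll_lock(self):
        # Like Caps Lock or Num Lock, the state persists after release.
        self.scroll_lock = not self.scroll_lock

    def arrow_down(self):
        if self.scroll_lock:
            # Scroll the viewport; the cursor stays put.
            self.top = min(self.top + 1, self.num_lines - self.height)
        else:
            self.cursor = min(self.cursor + 1, self.num_lines - 1)

    def arrow_up(self):
        if self.scroll_lock:
            self.top = max(self.top - 1, 0)
        else:
            self.cursor = max(self.cursor - 1, 0)
```

      With the lock off, the arrow keys move the cursor; toggle it on and the same keys scroll the window instead, which is the same viewport-vs-cursor split commenters below describe in Excel.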

      • I think the only application I've ever used that supports scroll lock usage is Lotus Notes, which sucks.
        • It still functions in Excel, literally blocking scrolling, which, when I've accidentally hit the damned button, really annoys the living crap out of me.

            • Ah, I forgot about that one... The annoying thing about Notes is that if you've typed less than a full page, then accidentally hit the scroll lock key, it just seems like your arrow keys aren't working. Drove me crazy the first time it happened.
            • by Belial6 (794905)
              Yeah, it must be the software that sucks. After all, it is doing exactly what you told it to do instead of what you wanted it to do. All those programs that do what you tell them to do suck. It isn't you at all. It's the software.
              • Troll harder, spanky
                • by Belial6 (794905)
                  Says the guy bagging on software for doing exactly what he told it to do. Heck he even used the key with the instructions printed right on it, and he still blames the software.
                  • Says the guy that doesn't understand what "accidentally" means
                    When applications rarely use scroll lock, it's not that obvious why all of a sudden the arrow keys stop working. The way I phrased my first comment made it sound like the scroll lock thing sucks, but I meant Notes just sucks as a whole. I only meant it was an annoyance. I've been using the software for at least 7 years and have found plenty to hate about it.
                    Also, just because a program does what you told it to do doesn't mean it's designed well.
                    • by Belial6 (794905)
                      Notes predates proper security on the desktop. That is why it has a lock key. The F5 being the refresh key is a MS thing, while Lotus has always been cross platform. On top of that, even MS doesn't stay consistent with F5. So, your complaints boil down to "I don't understand it, so it sucks". I didn't misunderstand your statement. You pressed a key, and it did what it was supposed to do. Blaming the software for that is stupid.
        • by X0563511 (793323)

          Linux and BSD consoles use it too. Interestingly, it "locks" the scroll, so you can actually read kernel messages (use shift with page up/down to scroll in either direction while locked)

          • by hedwards (940851)

            That was my thought; it works differently under Linux than under BSD, but the functionality is there.

            I tend to get annoyed by Logitech and the other idiots that remove those "superfluous" keys, as they're not always superfluous. It annoys me that my ThinkPad lacks a Pause button, the same one that's used in Win + Pause to open up that menu.

            • by X0563511 (793323)

              I think on Linux you just don't use shift, where you have to use shift under BSD. Not sure. And I agree wholeheartedly about keyboard mangling... so irritating.

    • Civilization (Score:4, Interesting)

      by XanC (644172) on Friday August 12, 2011 @12:37PM (#37069792)

      I remember in the original Civilization, if you had Scroll Lock on, the arrow keys would show you around the map rather than moving the active unit.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Useful in FreeBSD console. Hit scroll lock to scroll through terminal with arrow keys like xterm scrollbar.

    • by tlhIngan (30335)

      "However, a mysterious key called Scroll Lock doesn't actually do anything."

      30 years ago... as useless then as it is now.

      Not really. Depending on your usage, scroll lock is very useful at controlling a KVM. It can also be very useful if you deal with very large spreadsheets - even today you can hit scroll lock and then use the cursor keys to scroll through the document rather than use the scroll bars or mouse. Think of it as the "mousewheel" for the keyboard.

      It's a shame more apps don't use it - if you want

      • by antdude (79039)

        Same here. I use KVMs from Y2K days and they still are useful and work well.

      • by Dogtanian (588974)

        Depending on your usage, scroll lock is very useful at controlling a KVM.

        Oh Christ, yeah... *this*.

        I read the original comment thinking "yeah, I don't think I've *ever* used Scroll Lock" (aside from trying it out). Then I read your comment and realised that, actually, I use it *every sodding day* I'm at work!

        To be fair though, this isn't really the original or intended usage- and I sort of keep it mentally separate from "actual" keyboard usage (which explains my original oversight), since it's a sort of hack enabling convenient switching of the KVM. And I'm pretty sure that they

    • by asdf7890 (1518587)

      FTA:

      "However, a mysterious key called Scroll Lock doesn't actually do anything."

      30 years ago... as useless then as it is now.

      It is useful in many spreadsheet applications. With it on, Excel, for instance, will scroll your viewport without altering your cursor position. I'm sure I've seen it used the same way in some turn-based strategy games too.

    • Part of coming of age is actually finding uses for that key. Clue: check Maniac Mansion (or was it Indiana Jones and the Fate of Atlantis?) and Excel.

    • To be honest, I wish more programs would include Scroll Lock support. The intended functionality is actually rather handy if you don't want to or can't use a mouse.

    • by sootman (158191)

      "And now, continuing my review of every key on the keyboard, the next one is called 'caps lock.' HMM, WHAT DOES IT... OH MY GOD! THIS IS FUCKING SWEET!! I AM GOING TO USE THIS ALL THE TIME!!!!!"

      And now here's some plain text to work around the lameness filter. I hope it goes by percent of capital letters and not by sheer number. Here's hoping...

    • by jafac (1449)

      I just looked down at my keyboard, and I was like; "WTF?!" I *do* still have one! Right between SysRq and Pause/Break. I've been using computers since 1980 (TRS-80, Apple ii, PLATO... then IBM PC's), and I can't recall a single time any of these three bitches ever did a damn thing! EVER!

  • by kheldan (1460303) on Friday August 12, 2011 @12:25PM (#37069646) Journal
    My first PC was built on an XT clone motherboard. Being an electronics tech and having built the S100 bus-based computer I'd been using for years, I decided to borrow a desoldering station from work over a weekend, and desoldered every chip on the motherboard so I could install sockets for all the chips against the eventual need for troubleshooting and repair. I never did have to replace a single chip on that board the entire time I used the thing.
    • by idontgno (624372)

      Ow. My eyes water at the thought. If the BOM of that clone was anything like that of the original 5150 (and they usually were), the motherboard had as many as 100 DIP ICs.

      That's a metric butt-ton of solder wick. I'd be crosseyed and incoherent after desoldering and resoldering over 1600 through-hole pads. And with my luck, I'd damage at least one of the ICs in the process, probably one of the harder-to-come-by chips (like the 8288 bus controller), and maybe one or more of the solder pads too.

      Good work on th

      • by kheldan (1460303)
        As stated, I used a desoldering station, which has a vacuum pump, not something like a Soldapullt or similar manual desoldering pump. No way I'd do that manually!
  • If you go to the page and wonder where the text is: You have to enable JavaScript for the site to even get it displayed.
    Of course all the other stuff gets displayed even without JavaScript ...

  • the machine that popularized personal computing

    I tend to think that the Apple II had a hand in popularizing personal computing

    • by xero314 (722674) on Friday August 12, 2011 @12:48PM (#37069964)

      I tend to think that the Apple II had a hand in popularizing personal computing

      You can think that, but the reality is that the personal computing revolution did not begin until the arrival of the Commodore 64.

      • by GreatDrok (684119)

        It depends on where you were living I guess. I was in the UK at the time and my school got the first computer in the county in 1979 - a Commodore Pet 3008. That was the first machine I learned to program but the Commodore BASIC was feeble at best. A year later I bought a Sinclair ZX80 and then 81 and really got stuck into programming. The BASIC wasn't much better than Commodore though and I wanted more than the '81 could offer so was looking at the VIC20 (still that nasty BASIC) and then the 64 which wa

        • by jedidiah (1196)

          You can hardly create a consumer revolution if no one can buy your stuff.

          Apple was certainly at the head of the pack but their stuff was ridiculously priced and actually prevented more people from getting in on the action.

          Apple was still selling its 8-bit kit into the 68K era with prices higher than machines meant to compete with the Macintosh.

        • It depends on where you were living I guess. I was in the UK at the time

          You're right as far as pointing out that the UK was somewhat different. But some aspects of your summary are open to question.

          Most seriously, the fact that you completely fail to even *mention* what was AFAIK far and away the best-selling computer in the UK during the 1980s- the ZX Spectrum [wikipedia.org]- renders your summary misleading by itself. The Spectrum's influence on bedroom programmers is often credited (correctly or otherwise) with kickstarting the strength of the early UK software industry.

          Yes, most of the

          • by GreatDrok (684119)

            The Spectrum wasn't really relevant as a programmer's machine. Sure, it was popular, but most people who had one never wrote so much as a single line of code, and Sinclair BASIC was primitive. Better than on the '81, but still pretty lousy, and your code had to be full of GOTO statements. The Spectrum and C64 were both much the same, a game platform and not relevant to computer literacy in the way the BBC and its cheaper sibling, the Electron, were.

            As for saying few homes had BBC micros, that is far from the tr

            • by Cederic (9623)

              I call bullshit. More people learned to program on C64 and Speccies at home than on a BBC model B.

              The C64 and Speccie were games platforms. Games were played by kids. Kids tried writing their own. Those kids became the games programmers and software engineers of the 90s.

              I know this, I'm one of them.

            • by Dogtanian (588974)

              The Spectrum wasn't really relevant as a programmer's machine. Sure, it was popular, but most people who had one never wrote so much as a single line of code, and Sinclair BASIC was primitive.

              The sheer number of Spectrums out there meant that even if a low proportion of people actually programmed on them, that was still a large number of people. I agree with you that a higher proportion of BBC owners probably used their machines for "serious" stuff (*), but that's still a higher proportion of a much smaller user base.

              You also forget that people wanting to get the most out of the machine used assembly/machine code, not BASIC.

              As for saying few homes had BBC micros, that is far from the truth.

              Maybe I overstated this, but the fact was that the BBC was *not* domin

    • by msauve (701917)
      ...and the Commodore PET, and the TRS-80, and the Sinclair ZX-80 and the Commodore 64.
      • by Chris Burke (6130)

        And the TI-99!

        Or was that just my first computer, bought from a garage sale as a birthday present...

    • by jgagnon (1663075)

      Let's not forget the Commodore 64! :p

    • Re: (Score:3, Insightful)

      by swordgeek (112599)

      Agreed. I think back to 'the day', and while I had an Atari 400 and worked with both PETs and IBMs, the Apple ][+ was probably the watershed machine.

      Half a decade in the future, the Commodore 64 sold more units but that's because computers were popular by that point. People WANTED them! Lots of people had been buying computers (usually horrible things - the Vic-20 or the TI-99/4A) because they were exposed to the Apple at work or at school, and when the C64 came along it pretty much wiped the floor with the

  • by Anonymous Coward on Friday August 12, 2011 @12:35PM (#37069772)

    Written by someone who was born the year the computer came out.

    Hands-On With the IBM 5150, Thirty Years Later [wired.com]

  • 30 years. Cool. That might be enough of a soak to get the bugs out.

  • by Wovel (964431)

    Microsoft, for example, was involved right from the beginning. However, at the moment the machine is only sold in the US. IBM will not say when, if ever, it will come to Britain.

    This paragraph is confusing. Did the reviewer believe Microsoft was a British company?

    • by Trixter (9555)
      No, the reviewer was British and wondering when they'd be available in the UK.
  • "(It’s fun to toy with the idea of us all using computers directly descended from the Commodore 64.)"

    I'm trying to imagine what a 64 bit descendant of the 6502 would look like...and it's not pretty :)

    • I'm trying to imagine what a 64 bit descendant of the 6502 would look like

      The 32-bit descendant of the 6502 is the ARM architecture. But half a year ago, ARM had no plans to expand from 40-bit to 64-bit [techspot.com], at least not until RAM hits half terabyte levels.

      • by itsdapead (734413)

        The 32-bit descendant of the 6502 is the ARM architecture. But half a year ago, ARM had no plans to expand from 40-bit to 64-bit [techspot.com], at least not until RAM hits half terabyte levels.

        When the ARM was launched in the late 80s it was a kick-ass desktop workstation processor that could wipe the floor with a 286. They even made an ARM "accelerator card" for the PC (see here [chriswhy.co.uk] and search for "springboard").

        In our IBM PC-free alternate universe, the ARM could have taken off on the desktop, inevitably migrated into servers, and would probably have got some 64-bit love rather earlier. Back in the real world, it survived by carving out a niche in mobile/embedded applications, which don't need 64

        • Back in the real world, it survived by carving out a niche in mobile/embedded applications, which don't need 64 bits.

          And I'd bet a lot of applications that aren't mobile or embedded don't actually need 64 bits in the first place. For example, The Legend of Zelda: Ocarina of Time made the jump from the 64-bit MIPS R4300 CPU in the Nintendo 64 to the 32-bit ARM CPU in the Nintendo 3DS because few parts of any N64 game actually used double precision.

      • by Danathar (267989)

        Is it really a descendant? Or a distant relative? My research says that the ARM was designed with many of the same concepts and by generally the same group of people, but it's not a direct lineage.

    • WDC did have some very rough plans to introduce a 32-bit "65832", which would have used some unused opcodes from the 65816.
  • "However, a mysterious key called Scroll Lock doesn't actually do anything."
  • by UnknowingFool (672806) on Friday August 12, 2011 @01:01PM (#37070108)
    I distinctly remember it like this:

    Less space than a Cray. CGA at best. Lame.

    • by Dogtanian (588974)

      I distinctly remember it like this:

      Less space than a Cray. CGA at best. Lame.

      Funny, yeah- but it also would have been a fair review of the original PC. The IBM PC/MS-DOS and their derivatives became the de facto standard for various reasons- IBM nametag guaranteed initial success, generic PC was easily cloned, MS retained right to sell OS to other companies, latter two leading to open market, competition and commoditisation. But none of these have anything to do with the fact that the original IBM PC was a great or interesting machine, because frankly it wasn't.

      Unlike the iPod rev

  • by roc97007 (608802)

    To us computer geeks, the PC was underpowered and expensive even for the time. And that broken keyboard... Ugh.

    We got a few in at work when they first came out but weren't happy with them. It wasn't until clones with a 286 and Selectric-type keyboard started to become available that it really took off. (Wow, remember when a 286 was fast??)

    Your mileage, as always, may vary, I guess.

    And as far as the PC's role in popularizing computing, um, did I imagine those computer shows I attended before the PC came

  • Then, as now, people said 'it's too expensive...other machines (take your pick) are better/faster/cheaper.' Well, they were right, of course, but, as we now know, the IBM PC and its clones went on to absolutely destroy all of the other competition. Why? Were buyers just idiots who wanted the 'IBM' name on the front? Of course not. The reason the IBM PC and its clones went on to success was because they allowed businesses who were using typewriters, 'word processors (larger businesses),' and 'mini-compu

    • by jedidiah (1196)

      Nothing about any of the other available options prevented them from being put to business use.

      The only limiting factor was the lack of a respectable brand name like IBM.

      Your entire rant can be summed up as "no one ever got fired for buying IBM".

      Microsoft merely inherited the old monopoly.

  • by shoor (33382) on Friday August 12, 2011 @02:06PM (#37071162)

    If you can find an old computer magazine from the late 70s (BYTE, Dr Dobbs, Creative Computing, etc) you'll see ads for all kinds of different systems. It was like the early days of the automobile industry when there were many manufacturers that are all but forgotten now. Too many for it to last; there had to be what marketing people call a 'shakeout'. When IBM announced the PC, it legitimized these home computers in the minds of a lot of people who liked the idea of having a computer in their home with the 3 letters IBM on it.

    But they were expensive and soon people were buying the cheaper clones. As I understand it, IBM was still mostly interested in their Mainframe business. They left the PC's architecture 'open', which allowed the cheap clones to be made. This was a decision that had important consequences I think. If IBM had suppressed the clones, what would have happened? Perhaps Apple would have become top dog in the home PC market, or perhaps some other company. Would there have ever been any 'open' architecture at all? The openness was spoiled by Microsoft cutting deals with the hardware manufacturers of those clones so that no other software had much of a chance. My feelings about Microsoft should be clear from my sig.

    My big disappointment was that IBM chose to use the Intel 8088 chip. The Zilog Z8000 and Motorola 68000 were much more advanced, and I thought it was a pity that they became niche architectures by comparison. I realize IBM wasn't interested in creating something 'insanely great'. Mediocrity, or even downright inferiority prevailed. There were sound business reasons for IBM's decision at the time, but that doesn't mean I have to like the result.

    • Also keep in mind that many of the parts used in the IBM PC were shared with other product lines to keep costs down. Were they the best parts for the job? Not likely, but they were cheap and readily available to IBM at the time.
    • So true. The 68000 was such a better processor, for example.

      And for a few pennies more in resistors, no one would have had to experience IRQ hell as cards could have been self-configuring.

      IBM also should have shipped Forth as the OS, not DOS.

      Too bad Commodore was such a messed up company, too, as otherwise they had some great hardware and later software (like with the Amiga).

      I'm glad I've kept some of my old Byte magazines and others to remember that history, but I got rid of most of them, sadly. How quickl

    • by Waccoon (1186667)

      They left the PC's architecture 'open', which allowed the cheap clones to be made.

      It's important to note that IBM didn't have a choice in the matter, for the same reason Coleco was legally allowed to make a clone of the Atari 2600.

      The Mac required special ROMs, so cloning wasn't an option (not like some companies didn't try). The Amiga couldn't be cloned because of all the custom chips. The IBM PC was very easy to reverse engineer, and IBM couldn't do a thing to stop it (not like they didn't try, either).

  • I worked in an independent computer store when I was younger, starting in 1987, just on my 13th birthday. I was there on and off through high school and university for another 10 years until the PC Worlds and Games killed off the small indie. I would sell a lot of computers to people, mainly the Amiga, but some were hell bent on the ST, which I hated at the time.. Ironically I owned a few towards the end of the 90's for MIDI sequencing - but that's another story.. Anyway, to my shame, I do remember us gett
    • by Smauler (915644)

      I bought an Amiga... then went onto PCs... and here's why. My PC had 7 different autoexec.bat files, and 3 different command.com files for different games and startups.... it was a nightmare. I think it was mainly lack of funds and the fact I got my dad's old PC with a hard drive, just about when Civilisation and Doom came out that turned me (I know civilization was available on the Amiga too, but I didn't have a hard disk for it). When I got Doom networked on 4 computers in our house (damn ipx hack se

  • For all who haven't seen it yet:

    http://www.blinkenlights.com/pc.shtml [blinkenlights.com]
