How the Lisa Changed Everything

Sabah Arif writes "The Lisa, begun in 1979 to add an inexpensive business computer to Apple's lineup, enjoyed little success. Despite its advanced object-oriented UI and powerful office suite, the machine ended up priced well above the means of most businesses. Even so, the Lisa influenced most later user interfaces and introduced many features unheard of in systems like the Xerox Star or VisiOn. Read the story of the development and demise of the Apple Lisa at Low End Mac."

  • by enigma48 ( 143560 ) * <jeff_new_slash AT jeffdom DOT com> on Sunday October 09, 2005 @10:40AM (#13750716) Journal
    /. already posted this story http://apple.slashdot.org/article.pl?sid=05/07/31/1346224&tid=190&tid=3 [slashdot.org] a few months ago. In their defense, the old article was hosted at Braeburn.ath.cx (but it looks like they've redone their website and braeburn now resolves to lowendmac.com).
  • by Anonymous Coward on Sunday October 09, 2005 @10:40AM (#13750718)
    Haha.. reminds me of this joke:

    Knock knock..

    Who's there?

    (wait 45 seconds) ... ..... .......

    Lisa!

  • You got to wonder (Score:5, Interesting)

    by Stevyn ( 691306 ) on Sunday October 09, 2005 @10:42AM (#13750726)
    How would things be different today if Apple initially offered the Lisa at a substantially lower price just so people experienced the GUI? IBM and the clones were much cheaper, so businesses probably chose initial cost over an interface that could have lowered training costs and increased productivity. And if people were using Apple machines at work, then they would have bought an Apple for home later on.
    • { wonders idly } Has anyone done a Lisa emulator for OS X?

      Lisa 7/7 under OS X... that would be trippin !
    • How would things be different today if Apple initially offered the Lisa at a substantially lower price just so people experienced the GUI?

      Maybe there would be no Apple, because they would have lost their shirts selling expensive hardware at a loss. If comparable hardware could be produced at a reasonable price back then, Lisa would have had much more competition. But most computer makers knew the time wasn't yet right.

      I disagree with the basic notion that the Lisa influenced everything to follow beca

      • As another example, I don't think the modern game hardware industry owes much to Silicon Graphics - their hardware was ahead of PC hardware for a long time, but at a price most people wouldn't pay. When the time was right, graphics hardware became widespread, but it was no thanks to SGI who were trying to maintain the old prices.

        Are you kidding me? The modern game hardware industry owes a lot to SGI.

        First, let's talk about SGI's direct effect on the industry:

        SGI designed the N64. It was basically all the
    • The IBM PC wasn't successful because of its price (the Apple II was cheaper and more open) or because of its technology. It was successful because the nameplate on the front had a big, prominent IBM logo.

      If the Lisa had been an IBM product, it would've been vastly more successful. Same with the Macintosh, Commodore 64, anything. Businesses bought IBM, because it was IBM and because IBM knew how to support corporate customers. Such was the power of the brand in those days, and there's nothing like it in the
  • by gilesjuk ( 604902 ) <giles@jones.zen@co@uk> on Sunday October 09, 2005 @10:44AM (#13750730)
    The Apple Lisa was way too expensive and rather slow. The Mac was much cheaper and worked much better. Hence why you buy an Apple Mac not an Apple Lisa when you go to an Apple store now.

    It was Steve Jobs who brought us the Mac too; he recognised it was the right product and better than the Lisa. Much of the work on the project until Jobs took over was done by Raskin and his team.
    • Actually, the Lisa was out for a year before the "Baby Lisa" (Macintosh) came out. As for why people would use Lisa rather than Macintosh? Well, for one thing, Lisa came with great software (in some ways better than what's available even today). Macintosh, on the other hand, had, at best, MacWrite and MacPaint, and little else was available until a year later. There were no native dev tools, so there were no in-house apps and no likelihood of them. There was no hard drive. The computers had 128K of TOTAL R
    • It was Steve Jobs who brought us the Mac too, he recognised it ...

      Hang on there. Steve Jobs can take credit for seeing the Macintosh project taken to market, but Jef Raskin deserves all the credit for the initial concept (and name, complete with misspelling) and early development. It was only after Jobs was refused the position of project leader for the Lisa that he came across the small Mac research project, consisting of about 10 people Raskin had collected, and set about forcibly taking it over.

      Fear

      • by Anonymous Coward
        Jef Raskin's total contribution to the Macintosh project was "Hey, let's build an inexpensive computer that's easy to use." That's it. He had absolutely no technical input whatsoever, and no input at all once Jobs took over the project.

        People love to latch on to Raskin and call him the unsung hero, but the fact is that he totally abandoned the project when he didn't get to do everything his way.
    • The Mac was not the better computer at the time. If it had not been for the laser printer, the Mac would have died too. Remember the Mac had only a single floppy drive and 128K RAM. Then came the 512K Mac and second floppy (a vast improvement, but still not really usable in an office setting). Every once in a while you would have to swap floppies out of the drives (20 swaps sometimes to save a document, yes really!)

      The Mac did not truly become usable in the office until the Mac Plus came out. (My extern
  • Ah, Memories (Score:5, Informative)

    by rob_squared ( 821479 ) <<rob> <at> <rob-squared.com>> on Sunday October 09, 2005 @10:47AM (#13750740)
    I remember the good old days, back when the Apple computers were simpler. When the mouse only had 1 button.

    I kid, I kid.

    Anyway, here's a picture of the original ad: http://www.jagshouse.com/lisabrochure.html [jagshouse.com]

  • Jobs didn't get it. (Score:3, Interesting)

    by Baldrson ( 78598 ) * on Sunday October 09, 2005 @11:00AM (#13750784) Homepage Journal
    As I described here years ago: [slashdot.org]

    Gutting programmer effectiveness by a factor of at least 10 and routing new programmers into BASIC, while maintaining and even slightly improving the GUI, is a great example of "not getting it". You can say OOP would become important in a few years and I can say the windowing GUI would become important in a few years with or without Jobs. But the revolution had already occurred at PARC (and if you're focused on the mouse environment -- even a decade earlier at SRI, which is where PARC, and indeed PLATO with its touch panel [thinkofit.com], got their inspiration -- I remember sitting in meetings at CERL/PLATO viewing the films of SRI's research in 1974 as part of PLATO's computer-based conferencing project).

    DOS applications were starting to pick up on it despite the horrid CGA they had to work with initially -- and it wasn't because Jobs did the Mac. The Windowing GUI was inevitable and obvious to people with money as well as most personal computer programmers, especially once Tesler had already popularized it with his 1981 Byte magazine article [byte.com].

    Dynamic, late-binding programming environments that highly leverage the sparse nerd matrix out there -- like Smalltalk, Python, etc. -- are, however _still_ struggling to make it past the concrete barriers Jobs poured into the OO culture with the Mac.

    When Jobs passed up Smalltalk for Object Pascal, and then again, with Next, passed up Smalltalk for Objective C, he set a pattern that continues to this day when Sun passed up that sun-of-Smalltalk, Self [sunlabs.com] and went with that son-of-Objective-C, Java.

    Gutting the superstructure of technology while maintaining appearances isn't leadership.

    • by Baldrson ( 78598 ) * on Sunday October 09, 2005 @11:08AM (#13750832) Homepage Journal
      Here is the base message [slashdot.org] originating the "Jobs Didn't Get It" exchange:

      When Jobs brought technology in from Xerox PARC, and Adobe, he had the keys to the kingdom handed to him on a silver platter:

      1) A tokenized Forth graphics engine.

      2) Smalltalk.

      The Forth graphics engine was originally intended to grow from a programmable replacement of the NAPLPS videotex graphics protocol, into a silicon implementation of a stack machine upon which byte codes, compiled from Smalltalk would be executed. At least that's the direction in which I had hoped to see the Viewtron videotex terminal evolve when I originated the dynamically downloaded tokenized Forth graphics protocol as a replacement for NAPLPS in 1981 and discussed these ideas with the folks at Xerox PARC prior to the genesis of Postscript and Lisa.

      If Charles Moore could produce an economical 10MIPS 16 bit Forth engine on a 10K ECL gate array on virtually zero bucks back then, why couldn't Jobs with all his resources produce a silicon Postscript engine with power enough to execute Smalltalk?

      Somehow a Forth interpreter made it into the first Mac, as did Postscript, but Smalltalk just didn't.

      The Motorola 68000 family just didn't have the power. It may have been better than the Intel 86 family, but that really isn't saying much, now is it?

      • An addendum (Score:3, Interesting)

        by Baldrson ( 78598 ) *
        While Jobs had his failure of vision in not pursuing a hardware stack machine similar to Moore's, Tesler also should have pushed back on Jobs harder to get some form of Smalltalk onto the Lisa, even with the Motorola chip. The reason is that there are optimization techniques involving type inferencing and dynamic code generation that had been researched and to some extent exploited at PARC, and have certainly become a mainstay of the JVM today. If the software engineering resources that were to be invested i
      • Hindsight is 20-20 (Score:3, Insightful)

        by Thu25245 ( 801369 )
        So Jobs didn't get it.

        Gates didn't get it.

        Sun didn't quite get it.

        But we, with the full benefit of hindsight...we get it. Just like those little-known geniuses, writing papers in the bowels of university research labs and Xerox PARC. We get it now. We are so friggin' smart. So much better than those short-sighted billionaires who pillaged and plundered the ideas of their betters twenty years ago.

        We get it. We are so brilliant. We totally rock.
    • Objective-C's object system and general philosophy is _very_ smalltalk-ish.

      Java is much more "C++ with some warts removed" than an Obj-C derivative. Obj C _is_ a "dynamic, late-binding programming environment." C++ and Java are not.

      Self is no more a son of Smalltalk than Java is a son of Obj C. They (Self and Smalltalk) both came out of PARC, but they are very different.

      I suspect you have no more idea about what went on at Apple than you do about programming languages, but I can't speak to that myself.
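
      (A minimal sketch of what "dynamic, late binding" means in practice, since the thread keeps leaning on the term: the method to run is looked up by name at run time, as in a Smalltalk or Objective-C message send, rather than fixed at compile time. The C below is only an illustration of the idea -- the names and tables are invented, and it is not how any real runtime is implemented.)

          #include <stdio.h>
          #include <string.h>

          typedef struct Object Object;
          typedef void (*Method)(Object *self);

          typedef struct { const char *selector; Method imp; } MethodEntry;

          struct Object {
              const char  *className;
              MethodEntry *methods;      /* searched by name at send time */
              int          methodCount;
          };

          /* "Message send": resolve the selector against the receiver's table at run time. */
          static void send(Object *receiver, const char *selector) {
              for (int i = 0; i < receiver->methodCount; i++) {
                  if (strcmp(receiver->methods[i].selector, selector) == 0) {
                      receiver->methods[i].imp(receiver);
                      return;
                  }
              }
              /* Unknown messages are a run-time condition, not a compile error. */
              printf("%s does not respond to '%s'\n", receiver->className, selector);
          }

          static void windowDraw(Object *self) { printf("%s: draw\n", self->className); }

          int main(void) {
              MethodEntry windowMethods[] = { { "draw", windowDraw } };
              Object window = { "Window", windowMethods, 1 };
              send(&window, "draw");   /* resolved at run time */
              send(&window, "close");  /* handled at run time, not rejected at compile time */
              return 0;
          }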
      • Objective-C's object system and general philosophy is _very_ smalltalk-ish.

        Yes, Objective-C has late binding and limited dynamic typing, but that's only a tiny part of what Smalltalk offered in 1980.

        The sad fact is that it is Apple, more than any other company, that is responsible for the bloated toolkits and libraries that we see today: they set the pattern with the Lisa and the Macintosh toolboxes, and everybody else copied it, up to and including Java and C#. And while Objective-C happens to copy some
        • "The sad fact is that it is Apple, more than any other company, that is responsible for the bloated toolkits and libraries that we see today..."

          Oh please. Probably the Apple II or the original IBM PC were the last micros where you could count on knowing the ins and outs of every byte running on the machine. And even those had BIOSes.

          Once you get into interactive GUIs, you don't want every single application having to reinvent QuickDraw, or a menuing system, or a windowing system, or a file system, or

      • I don't know why some people think Self is a very different language than Smalltalk-80. With the optional parser and GNU Smalltalk classes you can even file in Smalltalk-80 code and run it.

        http://research.sun.com/self/papers/smalltalk.pdf [sun.com]

        I don't remember who said "Self is like Smalltalk, only more so" but that is a great definition. To avoid having to repeat this discussion all the time I have renamed my Self/R project as Neo Smalltalk.
      • Patrick Naughton wrote [umd.edu]:

        ...When I left Sun to go to NeXT, I thought
        Objective-C was the coolest thing since sliced bread, and I hated C++.
        So, naturally when I stayed to start the (eventually) Java project, Obj-C
        had a big influence. James Gosling, being much older than I was, he had
        lots of experience with SmallTalk and Simula68, which we also borrowed
        from liberally.

        The other influence, was that we had lots of friends working at NeXT at
        the time, whose faith in the black cube was flagging. Bruce Martin was
        wor

    • by Tablizer ( 95088 )
      You have to remember that the microcomputer hardware was sloooow and expensive back then. Compiled "static" languages simply run faster, even if they take longer to program with.

      Personally, I feel that GUI's should be mostly declarative based such that one stores descriptions and attributes of windows and widgets rather than use boatloads of "new Window(...)" and "new Widget(...)" commands in code. Events are then bound to a programming language of choice. Declarative approaches are usually easier to adapt
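
      (A rough sketch of the "declarative" idea described above, assuming nothing about any real toolkit: the form is a table of widget descriptions that a generic builder walks, instead of hand-written construction calls, and event handlers would then be bound to the named widgets separately. All names here are hypothetical.)

          #include <stdio.h>

          typedef enum { WIDGET_WINDOW, WIDGET_BUTTON, WIDGET_LABEL } WidgetKind;

          typedef struct {
              WidgetKind  kind;
              const char *name;          /* handlers get bound to this name later */
              int         x, y, w, h;
              const char *text;
          } WidgetDesc;

          /* The whole form is data; changing the UI means editing this table. */
          static const WidgetDesc form[] = {
              { WIDGET_WINDOW, "main",     0,   0, 320, 200, "Invoice"     },
              { WIDGET_LABEL,  "lblDue",  10,  20, 100,  16, "Amount due:" },
              { WIDGET_BUTTON, "btnOK",  240, 170,  70,  24, "OK"          },
          };

          /* A generic builder interprets the description. */
          static void buildForm(const WidgetDesc *d, int count) {
              static const char *kinds[] = { "window", "button", "label" };
              for (int i = 0; i < count; i++)
                  printf("create %-6s '%s' at (%d,%d) size %dx%d text \"%s\"\n",
                         kinds[d[i].kind], d[i].name, d[i].x, d[i].y, d[i].w, d[i].h, d[i].text);
          }

          int main(void) {
              buildForm(form, sizeof form / sizeof form[0]);
              return 0;
          }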
    • Whoa, hello. Apple had a BUDGET to work with. Xerox (for all practical matters) didn't. What did a PARC workstation cost? $25,000+? Something like that?

      While I'm sure that, philosophically, you're correct (and I'm not going to really debate them, because half of what you said is gibberish to me), you have to remember that Apple is a corporation... they can't spend millions of dollars and years and years of R&D to get some high lofty goal of computing done, they had to get a product out the door
  • by Anonymous Coward on Sunday October 09, 2005 @11:05AM (#13750808)
    The article is hostile: the MC68000 was a 32-bit processor because it used 32-bit registers, had 32-bit math, addressed more than 16 bits' worth of memory with linear addressing, and used address registers that held 32 bits.

    The data bus and code bus used 16 wires... but it was a goddamned 32-bit chip, and this fact used to piss off Intel x86 people for many years.

    So much so that they try to rewrite history with articles like this crap that ignore that the chip was 32 bits.

    A 64-bit processor, for example, typically DOES NOT have 64 data bus lines to the actual motherboard RAM, and certainly NEVER EVER offers all 64 bits of addressing (possibly some offer 48 in this universe, though).

    But does that mean a 64-bit chip is not 64-bit? No!! Just as the 68K was a genuine 32-bit chip, and almost no effort was needed when a fully 32-bit-wired version was offered for sale.

    The article is hostile to the history of the Mac and Lisa.

    By the way, I bought both in the years they shipped.

    • by NormalVisual ( 565491 ) on Sunday October 09, 2005 @12:34PM (#13751222)
      Interestingly, the article also refers to the 8088 as a 16-bit processor, which is an 8-bit processor if one uses the same criteria that you'd have to in order to call a 68000 "16-bit".

      68000: 32-bit registers, 24-bit address bus (linear addressing), 16-bit data bus
      8088: 16-bit registers, 20-bit address bus (segmented addressing), 8-bit data bus

      I frankly don't consider the 8088 and 68000 even remotely comparable - it's far easier to program for (and design hardware around, IMHO) the 68K. The only difficulty that I knew of anyone really experiencing when moving to the 68020 and other full 32-bit variants was that people had gotten into the really bad habit of using the upper 8 bits of the A registers for general storage, which would break things on an '020 horribly. Even so, it was certainly nothing like the EMS/XMS hell that PC programmers had to go through just to use memory above 1MB because of the limitations of the 8088 memory architecture.
      • Interestingly, the article also refers to the 8088 as a 16-bit processor, which is an 8-bit processor if one uses the same criteria that you'd have to in order to call a 68000 "16-bit".

        No, it makes the rather confusing assertion that:

        1) Apple decided that a 16-bit machine was necessary for what they were trying to accomplish.
        2) Apple considered many CPUs for the LISA, including the 8088.

        So, you see, the article doesn't state that the 8088 is a 16-bit CPU, it just states that Apple considered it despite earl
    • by tjstork ( 137384 ) <todd.bandrowskyNO@SPAMgmail.com> on Sunday October 09, 2005 @02:11PM (#13751661) Homepage Journal
      Time to bust out the holy wars.

      I like the 68000 because it has so many registers, but I think, all in all, the 80386 is the better CPU.

      For reference, consider:

      http://www.freescale.com/files/32bit/doc/reports_presentations/MC680X0OPTAPP.txt [freescale.com]

      http://www.df.lth.se/~john_e/gems/gem0028.html [df.lth.se]

      http://linux.cis.monroeccc.edu/~paulrsm/doc/trick68k.htm [monroeccc.edu]

      http://www.csua.berkeley.edu/~muchandr/m68k [berkeley.edu]

      Right off the bat, we notice that the 68000 did not support 32-bit multiplication at all (a sketch of the workaround this forced follows at the end of this comment). Doesn't sound too much like a 32-bit chip to me. Compare that to Intel's quirky IMUL, which puts the result into EDX:EAX to get a real 64-bit result.

      Integer math was faster clock for clock on the 386. Compare things like 68K register addition to Intel register addition. There's no comparison.

      Compare

      http://www.gamedev.net/reference/articles/article214.asp#ADC [gamedev.net]

      to

      http://www.df.lth.se/~john_e/gems/gem0028.html [df.lth.se]

      Whenever you did any 32 bit pointer math on a 68k, you paid a huge, huge performance penalty. It was always more efficient to do things in 16 bit PC relative addressing.

      The 68K had no concept of isolated memory or tasks. So systems like the Amiga and the Macintosh would run without any isolation between processes. I was an Amiga fan boy and I used to get that GURU meditation error so much that it was not even comical.

      The tragedy of the 386 architecture was actually Microsoft and not Intel. DOS and Windows did not use even the 386's full capability for memory management. MS users would have to wait until August 1995, almost 10 years after the 386, for a true 32-bit operating system.
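
      (The workaround mentioned above, sketched in C rather than 68k assembly: the original 68000's MULU/MULS did only 16x16->32, so a full 32x32->64 multiply had to be assembled from four partial products; the 68020's MULU.L later did it in one instruction. Illustrative only.)

          #include <stdio.h>
          #include <stdint.h>

          /* Compose a 32x32->64 multiply out of 16x16->32 pieces, the way a
             68000 programmer had to. */
          static uint64_t mul32x32(uint32_t a, uint32_t b) {
              uint32_t al = a & 0xFFFFu, ah = a >> 16;
              uint32_t bl = b & 0xFFFFu, bh = b >> 16;

              uint32_t ll = al * bl;   /* four 16x16->32 multiplies */
              uint32_t lh = al * bh;
              uint32_t hl = ah * bl;
              uint32_t hh = ah * bh;

              /* a*b = (hh << 32) + ((lh + hl) << 16) + ll, carries handled in 64 bits */
              return ((uint64_t)hh << 32) + ((uint64_t)lh << 16) + ((uint64_t)hl << 16) + ll;
          }

          int main(void) {
              uint32_t a = 0xDEADBEEFu, b = 0x12345678u;
              printf("%016llx\n%016llx\n",
                     (unsigned long long)mul32x32(a, b),
                     (unsigned long long)((uint64_t)a * b));  /* the two lines should match */
              return 0;
          }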

      • Of course the 386 was better than the 68000 - it was considerably newer! The 68000 was a mainstream processor when Intel's 8086 was Intel's mainstream processor.
      • by Misagon ( 1135 ) on Sunday October 09, 2005 @03:22PM (#13752073)
        You are comparing a newer generation to an older.
        The 68000 came in 1979. The i386 was first introduced in 1986.
        The 68020, however, introduced in 1987, did support 32x32->64-bit multiplication and division between all data registers. An external MMU was available, but it was unused by MacOS and AmigaOS.
        And the 68000 has had nice relative addressing modes from the start. I don't understand what you are referring to. (I have written machine code for all of these.)
        • mod parent up (Score:3, Informative)

          There is a huge 6-year gap between the two. I could say a PowerPC G5 processor is a lot better than an 8086. Well, of course it is. But does that mean a fast Athlon64 is slower than a G5?

          By the time the 80386 came out, Motorola had 68020s and perhaps 68030s.

          PS: the 68020s, and I think the 68000s, could run Unix because of built-in memory protection and other features. Could 8088s, 8086s, 80186s, or 286s do that? No, I do not consider early SCO XENIX (aka OpenServer) a real Unix with built-in memory protecti
        • It's an academic point. The mainstream computers that used these parts were only a couple of years apart, and that's a lot different.

          Apple Macintosh was introduced in 1984
          Commodore Amiga was introduced in 1985
          Compaq 386 introduced in 1986

      • The comparison was between 8086 and 68000, i.e. Mac vs PC XT, not with 80386.

        Even with 80386 though, PCs were inferior to Macs, because there was no proper 32-bit protected-mode O/S for 80386 (unless you count expensive Unix).

        Let's not forget that the 68000 was the CPU used in most coin-ops... the most impressive of which was OutRun and other SEGA Super Scaler games. All the SEGA machines used two 68000s plus a group of custom chips to do the graphics job... the 68000 was particularly co-operative with co-proc
        • Even with 80386 though, PCs were inferior to Macs, because there was no proper 32-bit protected-mode O/S for 80386 (unless you count expensive Unix).

          Firstly, there was OS/2 and Windows NT or more esoteric choices like NeXTSTEP. Not to mention that as of Windows/386 (ca. 1988) the OS was becoming "32 bit".

          Secondly, MacOS didn't have protected memory at all (until OS X, ca. 2000) and wasn't "32 bit" until System 7, in 1991 (and then only on "32 bit clean" Mac hardware).

          In technical terms, DOS-based Window

          • Firstly, there was OS/2 and Windows NT or more esoteric choices like NeXTSTEP. Not to mention that as of Windows/386 (ca. 1988) the OS was becoming "32 bit".

            Given that NT wasn't introduced until almost 10 years after the Mac, and only a year before Apple went to the PPC architecture, I don't think it really can be considered here. OS/2, while a true pre-emptive multitasking system (and a damn good one IMHO), didn't come out until about the time the second-generation Macs did (1987), and didn't even hav
            • Given that NT wasn't introduced until almost 10 years after the Mac, and only a year before Apple went to the PPC architecture, I don't think it really can be considered here. OS/2, while a true pre-emptive multitasking system (and a damn good one IMHO), didn't come out until about the time the second-generation Macs did (1987), and didn't even have a GUI until more than a year later.

              The comment I was replying to was talking about the 386, and PCs being "inferior" because "there was no proper 32-bit protec

          • Firstly, there was OS/2 and Windows NT or more esoteric choices like NeXTSTEP. Not to mention that as of Windows/386 (ca. 1988) the OS was becoming "32 bit".

            OS/2 was in the design stage when the 80386 came out (in 1986). Windows NT was only a thought in the back of Microsoft's mind. NextStep did not run on 80386. There was no Windows/386 fully 32-bit product then.

            Secondly, MacOS didn't have protected memory at all (until OS X, ca. 2000) and wasn't "32 bit" until System 7, in 1991 (and then only on "3

      • Why compare two processors from totally different time periods?!
      • MS users would have to wait until Sept 1995, almost 10 years after the 386, for a true 32 bit operating system.

        This is not correct. Their first choice would have been OS/2, followed closely by Windows NT. Not to mention that as of Windows/386 (ca. 1988), DOS-based Windows was starting to become "32 bit" (was using >640k RAM, could pre-emptively multitask DOS boxes, etc).

        DOS-based Windows was always an evolving product. To say "Microsoft users" didn't get any 32-bit goodness until Windows 95 is simpl

        • I primarily meant process isolation for mainstream computer users. People buying a computer at the local store were not going to get it preloaded with OS/2 or WNT. Windows 95 was the first consumer MS product that offered both a 32-bit OS and some degree of process isolation.

    • Actually, there were significant problems with the 68020, because many developers used the top byte of addresses for flags. But they were warned against it pretty much from day 1.

      (Emacs, BTW, was a major early offender, because the default assumption it made was that you could use part of every pointer for flags. I think it still does this, rather than defaulting to separate pointers and flag bits except on platforms where the assumption has been tested.)
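
      (A sketch of the habit described above, in portable C using an integer rather than a real pointer: on a 68000 Mac only the low 24 address bits reached memory, so stashing flags in the top byte of an address was "free" -- until a full 32-bit machine treated those bits as part of the address. The constants and names here are purely illustrative.)

          #include <stdio.h>
          #include <stdint.h>

          #define TAG_MARKED   0x80000000u   /* flag hidden in the unused top byte */
          #define ADDR_MASK_24 0x00FFFFFFu   /* the address bits a 68000 actually drove */

          int main(void) {
              uint32_t addr   = 0x0001A2B4u;         /* pretend heap address */
              uint32_t tagged = addr | TAG_MARKED;   /* "free" flag storage, 1984-style */

              /* On a 24-bit machine the flag is invisible to the hardware: */
              printf("24-bit view: %06X (flag %s)\n",
                     (unsigned)(tagged & ADDR_MASK_24),
                     (tagged & TAG_MARKED) ? "set" : "clear");

              /* On a 32-bit-clean machine the same word is a different address: */
              printf("32-bit view: %08X  <- no longer the object that was allocated\n",
                     (unsigned)tagged);
              return 0;
          }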
  • LISA (Score:3, Interesting)

    by hhawk ( 26580 ) on Sunday October 09, 2005 @11:17AM (#13750873) Homepage Journal
    The Lisa wasn't cheap unless you were comparing it to some mainframe. We had one at Bell Labs when I was there. Did some graphics on it, which was easier than trying to do graphics with troff/pic...

    but it was also always breaking and needing service, and it didn't get a lot of use..

  • to find out she has a penis.
  • by AshPattern ( 152048 ) on Sunday October 09, 2005 @11:25AM (#13750917) Homepage
    My father, an early adopter-type, had a Lisa for his office, and it was the Lisa that I first learned how to program on.

    One of the most maddening things about programming the Lisa was that you couldn't make programs that integrated well with the Lisa office suite. Why? Because there was no API for the GUI. None. If you wanted a window drawn, you fired up QuickDraw and drew it yourself. Want a scroll bar? Do it yourself. Menus? Right.

    I ended up only using the development environment's console for my programs' interfaces. The development environment was also console based, probably for the same reasons. A couple of years later, Apple released the Lisa Toolkit that had all that stuff, after they had announced they were going to discontinue it.

    So in my opinion, it was the lack of software that killed the Lisa, not its high price. I mean, people were paying for it, and they wanted more. The ability to use proportional fonts was the killer feature to end all killer features.

    It's worth noting that Apple learned its lesson about making developers happy - the developer support program for the Macintosh has been one of the best.
    • It's worth noting that Apple learned its lesson about making developers happy - the developer support program for the Macintosh has been one of the best.

      I was just reading an article that took a different view with respect to video and video hardware, claiming that Apple provides NO support for a third party to make their own video card drivers.
  • by toby ( 759 ) * on Sunday October 09, 2005 @11:41AM (#13750988) Homepage Journal
    Marcin Wichary [aresluna.org] has compiled a great deal of Lisa information [guidebookgallery.org], from screenshots, ads, brochures and articles to posters and videos, at his site GUI Gallery Guidebook [guidebookgallery.org]. Recent postings include 17 exclusive Lisa posters [guidebookgallery.org] for download and enjoyment, and an interview with Dan Smith [guidebookgallery.org] that reveals "The original trash can for Apple Lisa was supposed to have been an old, beat up alley trashcan, with the lid half open, flies buzzing around it and appropriate sounds as user put something inside."
  • by idlake ( 850372 ) on Sunday October 09, 2005 @12:13PM (#13751132)
    I had a Lisa, and Apple made the same mistakes with the Lisa as Xerox had made with the Star: it was too expensive, in particular for the limited hardware and completely incompatible software you got.

    Claims that the Lisa represented significant technological innovation seem dubious to me. You need to compare the Lisa to the totality of R&D efforts around at the time, not just the Star. Xerox alone had the Alto, the Star, Smalltalk, and probably others. The GUI of the Lisa was an evolutionary change, and not always for the better; what was under the hood of the Lisa can charitably be described as pedestrian. It took Apple 20 years to catch up and finally adopt system software that is even in the same league as Smalltalk-80 (that's "80" as in "1980"; Smalltalk-80 is the language and platform that Objective-C and Cocoa are modeled on).

    Lisa's main significance was to be a prototype for, and to be cannibalized for, the Macintosh (and it served as the main development machine for Macintosh apps for a while), but I can't think of any significant new technology it introduced.
    • Making a computer easy for the masses isn't a "technology," but it's still damned important. You talk from the typical open-source "code is everything, usability unimportant" viewpoint but you have to remember that for the average user, if they can't use a feature, that feature might as well not exist. Now you're right in that the user experience for a few reasons didn't really gel until MacOS 4 or so, but the Lisa was a thousand times better (for users) than anything that came before it.

      Alto was, from ev
      • but the Lisa was a thousand times better (for users) than anything that came before it.

        I think you underestimate how slow the Lisa was; the GUI was of no importance when waiting for it to do anything was an exercise in frustration.

        TWW

    • I can't think of any significant new technology it introduced.

      Ummm.... menus?

  • by Tablizer ( 95088 ) on Sunday October 09, 2005 @12:34PM (#13751226) Journal
    OOP is a technique for organizing programming code, not UIs. Thus, what exactly is an OO UI? I am not sure there is only one way to interpret a UI analog to programming-code techniques. In fact, nobody can even agree on a clear definition of OO in the code world. If you want to start a bar fight in OOP forums, ask for a precise definition of OO, and e-chairs start flying.
    • It is confusing, but they mean that the user interacts with documents ("objects"), instead of directly with applications or the Filer:

      Dan Smith, a major contributor to the desktop interface, had created the Lisa's first interface, the Filer. The Filer asked a user a series of questions about what task the user wanted to accomplish, and when enough questions were answered, the tasks were executed and the Filer shrank to the background.
    • Thus, what exactly is an OO UI?

      OS/2's Workplace Shell is generally considered the best example.

  • by Animats ( 122034 ) on Sunday October 09, 2005 @12:42PM (#13751255) Homepage
    It's worth remembering that the original Macintosh was a flop. The attempt to cost-reduce the Lisa resulted in a machine too weak to do much of anything. Remember the original specs: 128K, no hard drive, one floppy. Ever use one? Ever actually try to get work done on one? You had to fit the OS, the app, and your documents on one floppy. Or you could get an external floppy, which made the thing marginally useable. It was cute, but not productive.

    The lack of a hard drive was the killer. By the time the Mac came out, IBM PCs had a hard drive, so Apple was playing catch-up. Apple had tried building hard drives (the ProFile), but they were slow and crashed frequently. But at least the Lisa had a hard drive. Third parties added a 10MB hard drive to the Mac [atarimagazines.com] in early 1985, which brought performance up to an acceptable level. Some people say that third-party hard drives saved the Mac. But Apple fought them tooth and nail. Apple finally came out with a 20MB external hard drive for the Mac in 1986. This was very late; IBM PCs had been shipping with hard drives for five years.

    Sales for the Mac were well below expectations. Apple had been outselling IBM in the Apple II era. (Yes, Apple was once #1 in personal computers.) In the Mac era, Apple's market share dropped well below that of IBM.

    What really saved the Mac was the LaserWriter, which launched the "desktop publishing" era. But that required a "Fat Mac" with a hard drive and 512K. By then, the Mac had reached parity with the Lisa specs, except that the Lisa had an MMU and the Mac didn't. The Lisa also had a real operating system, with protected-mode processes; the Mac had "co-operative multitasking" in a single address space, which was basically a DOS-like system with hacks to handle multiple pseudo-threads.

    The MMU issue was actually Motorola's fault. The 68000 couldn't do page faults right, and Motorola's first MMU, the Motorola 68451, was a terrible design. The Lisa had an Apple-built MMU made out of register-level parts, which pushed the price up.

    Apple might have been more successful if they'd just stayed with the Lisa and brought the cost down as the parts cost decreased. They would have had to push Motorola to fix the MMU problem, but as the biggest 68000 customer, they could have.

    • by PapayaSF ( 721268 ) on Sunday October 09, 2005 @01:59PM (#13751604) Journal
      The Lisa, like other computers of the day, had rectangular pixels [lowendmac.com]. The Mac's introduction of square pixels allowed true WYSIWYG, and was crucial to desktop publishing and computer art. The Mac's still strong position in the graphic arts industry is a direct result.
      • Comment removed based on user account deletion
      • How do square pixels allow true WYSIWYG? The screen representation is still an approximation (unless you are printing on a B&W printer at 72 DPI). One thing that we've learnt since is that tall pixels are better value, as the human eye needs greater horizontal resolution than vertical (cf. ClearType, LCD mode in FreeType, or whatever). I would rather have a 2:1 tall-pixel display than a /2:/2 once the resolution goes above 100dpi - better visual resolution for a given investment in pixels. Quick
    • Some corrections (Score:3, Informative)

      by flimflam ( 21332 )
      The Fat Mac had neither a hard drive nor cooperative multitasking (unless you count desk accessories, but the original Mac had those too). There was Switcher which gave the ability to switch between apps, but there was no multitasking -- the background apps were completely suspended. Cooperative multitasking didn't come 'til Multifinder with System 6.

      Internal hard drives didn't come 'til, I want to say Mac II? Was there one for the SE?

      • Was there one for the SE?

        The SE/30 had an internal hard drive, but was produced after the Mac II and was really just a Mac II in an SE case.

      • Internal hard drives didn't come 'til, I want to say Mac II? Was there one for the SE?

        There was a Mac SE FDHD [everymac.com], but that wasn't until 1989. The Macintosh Plus [everymac.com], released in 1986, was the first Mac with a SCSI port, and Apple sold a matching hard drive. That was the first Mac with a supported hard drive. There had been previous third-party attempts to add a hard drive, but they either required internal mods or worked, slowly, through the printer port.

        Multifinder originally was an add-on for System 4. I

  • Hmm, it would be really interesting if this was brought back again.

    Today there's too much focus on what apps one uses, not what one wants to do.

    I all too often see people who have Photoshop installed when all they do is look at digital photos...
    • Start an MS Office 2004 (for Mac) application. It starts with a window with a bunch of documents of various types for you to choose from. This is borrowing from the older Works interface.

      For example, I open PowerPoint and I get a window with a bunch of document templates on it. If I select Excel Workbook, Excel opens up with a blank workbook and PowerPoint goes into the background.

      On the Lisa, you have a folder called Stationery that has blank copies of documents. You just double-click on one and the appl
    • Hmm, it would be really interesting if this was brought back again.

      Back ? UIs have been moving more and more away from being application-driven and towards being document-driven for the last 25+ years. Methinks you never really used computers before the 90s :).

      Today there's too much focus on what apps one uses, not what one wants to do.

      I think you've got that backwards. Interfaces today are *very* much document-centric, not application-centric.

      That many users treat them mostly as application-centric, is no

  • Another good site is (Score:2, Informative)

    by stevey ( 64018 )

    Another good site full of first-hand descriptions of how early Apple development was done is http://folklore.org/ [folklore.org].

    I've never owned a Mac, and am too young to have been involved in earlier developments - but that site does make it all seem very impressive.

  • I think it's kind of cool how in the Lisa ad below:

    http://www.jagshouse.com/images/lisa4.jpg [jagshouse.com]

    The real-world images above the icons look like the photorealistic icons used in OS X (no shit, yes they're real photos, but the lighting, camera angle, etc.)
