How the Lisa Changed Everything
Sabah Arif writes "The Lisa, begun in 1979 to add an inexpensive business computer to Apple's lineup, enjoyed little success. With its advanced object-oriented UI and powerful office suite, the computer was priced well above the means of most businesses. Despite its failure, the Lisa influenced most later user interfaces and introduced many features unheard of in earlier systems (like the Xerox Star or VisiOn). Read the story of the development and demise of the Apple Lisa at Low End Mac."
Knew I read this before (Score:5, Informative)
The Lisa Was a Very Slow Computer (Score:5, Funny)
Knock knock..
Who's there?
(wait 45 seconds)
Lisa!
You got to wonder (Score:5, Interesting)
Re:You got to wonder (Score:2)
Lisa 7/7 under OS X... that would be trippin !
Re:You got to wonder (Score:2)
Re:You got to wonder (Score:5, Informative)
Re:You got to wonder (Score:2)
Maybe there would be no Apple, because they would have lost their shirts selling expensive hardware at a loss. If comparable hardware could be produced at a reasonable price back then, Lisa would have had much more competition. But most computer makers knew the time wasn't yet right.
I disagree with the basic notion that the Lisa influenced everything to follow beca
Re:You got to wonder (Score:3, Interesting)
Are you kidding me? The modern game hardware industry owes a lot to SGI.
First, let's talk about SGI's direct effect on the industry:
SGI designed the N64. It was basically all the
Not IBM? No sale. (Score:2)
If the Lisa had been an IBM product, it would've been vastly more successful. Same with the Macintosh, Commodore 64, anything. Businesses bought IBM, because it was IBM and because IBM knew how to support corporate customers. Such was the power of the brand in those days, and there's nothing like it in the
not really (Score:2)
Re:not really (Score:4, Informative)
High-end PC: 640 KB RAM. No further expansion possible
LISA: 1 MB RAM, expandable to 2 MB.
High-end PC: One 360K floppy, one hard drive
LISA: Dual 860K floppies, 5 MB hard drive.
High-end PC/Hercules graphics: 720x348 bit-mapped display (plus 80x25 character mode)
LISA: 720x364 bit-mapped display
Re:not really (Score:2)
All for about $4800!
Re:You got to wonder (Score:3, Insightful)
Re:You got to wonder (Score:2)
True - but touchscreens (along with mice and all other pointing devices I've used) are an RSI disaster.
Also - even though they are initially easier, GUIs tend to make repetitive tasks more complicated - they are unsuitable for most POS applications.
A GUI doesn't have to be a windowing interface with a mouse, you know.
Indeed - and a text-based interface can have a lot of graphics - as long as it
Re:You got to wonder (Score:2)
I'd still call that a GUI.
To me, the best way of describing the difference between a CLI and a GUI is that with a CLI you tell the computer what to do, while with a GUI you select one or more operations the computer offers you.
From a usability perspective, a CLI requires you to know what you want to do /and how to do it/ before you can use it, only offering reactive feedbac
Re:You got to wonder (Score:2)
We have gotten to the point today where we dismiss anything that doesn't use a window manager as a program without a graphical interface, but that's simply not true. It is quite possible, and was very common in the past (before said window managers existed) to create GUIs in an entirely fixed-width, text-based envir
Re:You got to wonder (Score:5, Informative)
But, as a former Amiga user, I'll still say that the OS 1.x interface wasn't the best GUI ever. They improved quite a lot with 2.x onwards, but that was five years or so later.
Re:You got to wonder (Score:2)
Lisa and macs, at least, were perfectly usable up to their limits (and the first 128k mac's limits were easy to reach).
Re:You got to wonder (Score:3, Interesting)
I'm not sure that the Mac/Lisa would have been focussed on those types of apps (let's face it, 2-colour monochrome isn't going to compete with 4096-colour HAM), so perhaps the comparison isn't entirely valid.
Re:You got to wonder (Score:4, Insightful)
Mac changed everything (Score:4, Informative)
It was Steve Jobs who brought us the Mac too, he recognised it was the right product and better than the Lisa. Much of the work on the project until Jobs took over was done by Raskin and his team.
Re:Mac changed everything (Score:2)
Re:Mac changed everything (Score:3, Informative)
Hang on there. Steve Jobs can take credit for seeing the Macintosh project taken to market, but Jef Raskin deserves all the credit for the initial concept (and name, complete with misspelling) and early development. It was only after Jobs was refused the position of project leader for the Lisa that he came across the small Mac research project, consisting of about 10 people Raskin had collected, and set about forcibly taking it over.
Fear
Re:Mac changed everything (Score:2, Interesting)
People love to latch on to Raskin and call him the unsung hero, but the fact is that he totally abandoned the project when he didn't get to do everything his way.
Re:Mac changed everything (Score:2)
The Mac did not truely become usable in the office until the Mac Plus came out. (My extern
Ah, Memories (Score:5, Informative)
I kid, I kid.
Anyway, here's a picture of the original ad: http://www.jagshouse.com/lisabrochure.html [jagshouse.com]
Jobs didn't get it. (Score:3, Interesting)
Gutting programmer effectiveness by a factor of at least 10 and routing new programmers into BASIC, while maintaining and even slightly improving the GUI, is a great example of "not getting it". You can say OOP would become important in a few years and I can say the windowing GUI would become important in a few years with or without Jobs. But the revolution had already occurred at PARC (and if you're focused on the mouse environment -- even a decade earlier at SRI, which is where PARC, and indeed PLATO with its touch panel [thinkofit.com], got their inspiration -- I remember sitting in meetings at CERL/PLATO viewing the films of SRI's research in 1974 as part of PLATO's computer-based conferencing project).
DOS applications were starting to pick up on it despite the horrid CGA they had to work with initially -- and it wasn't because Jobs did the Mac. The Windowing GUI was inevitable and obvious to people with money as well as most personal computer programmers, especially once Tesler had already popularized it with his 1981 Byte magazine article [byte.com].
Dynamic, late-binding programming environments that highly leverage the sparse nerd matrix out there -- like Smalltalk, Python, etc. -- are, however _still_ struggling to make it past the concrete barriers Jobs poured into the OO culture with the Mac.
When Jobs passed up Smalltalk for Object Pascal, and then again, with Next, passed up Smalltalk for Objective C, he set a pattern that continues to this day when Sun passed up that sun-of-Smalltalk, Self [sunlabs.com] and went with that son-of-Objective-C, Java.
Gutting the superstructure of technology while maintaining appearances isn't leadership.
The original message. (Score:5, Informative)
When Jobs brought technology in from Xerox PARC, and Adobe, he had the keys to the kingdom handed to him on a silver platter:
1) A tokenized Forth graphics engine.
2) Smalltalk.
The Forth graphics engine was originally intended to grow from a programmable replacement of the NAPLPS videotex graphics protocol, into a silicon implementation of a stack machine upon which byte codes, compiled from Smalltalk would be executed. At least that's the direction in which I had hoped to see the Viewtron videotex terminal evolve when I originated the dynamically downloaded tokenized Forth graphics protocol as a replacement for NAPLPS in 1981 and discussed these ideas with the folks at Xerox PARC prior to the genesis of Postscript and Lisa.
If Charles Moore could produce an economical 10MIPS 16 bit Forth engine on a 10K ECL gate array on virtually zero bucks back then, why couldn't Jobs with all his resources produce a silicon Postscript engine with power enough to execute Smalltalk?
Somehow a Forth interpreter made it into the first Mac, as did Postscript, but Smalltalk just didn't.
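A tokenized graphics protocol of the sort described reduces to a byte-coded stack machine; here's a toy sketch with invented opcodes (PUSH, ADD, MOVETO are purely illustrative, not NAPLPS or any real protocol):

```python
# Invented opcodes for a toy Forth-style graphics token stream.
PUSH, ADD, MOVETO = 0, 1, 2

def run(program):
    """Interpret a token stream against a data stack, Forth-style.
    MOVETO pops y then x and records a pen position."""
    stack, pen, pc = [], [], 0
    while pc < len(program):
        op = program[pc]; pc += 1
        if op == PUSH:
            stack.append(program[pc]); pc += 1
        elif op == ADD:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == MOVETO:
            y, x = stack.pop(), stack.pop()
            pen.append((x, y))
    return pen

# "10 20 ADD 30 MOVETO" -> move the pen to (30, 30)
print(run([PUSH, 10, PUSH, 20, ADD, PUSH, 30, MOVETO]))  # [(30, 30)]
```

The appeal of the scheme is that the same compact token stream can be interpreted in software or, as the parent imagines, executed directly by a silicon stack machine.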
The Motorola 68000 family just didn't have the power. It may have been better than the Intel 86 family, but that really isn't saying much, now is it?
An addendum (Score:3, Interesting)
Re:An addendum (Score:3, Informative)
Hindsight is 20-20 (Score:3, Insightful)
Gates didn't get it.
Sun didn't quite get it.
But we, with the full benefit of hindsight...we get it. Just like those little-known geniuses, writing papers in the bowels of university research labs and Xerox PARC. We get it now. We are so friggin' smart. So much better than those short-sighted billionaires who pillaged and plundered the ideas of their betters twenty years ago.
We get it. We are so brilliant. We totally rock.
you clearly have no idea what you're talking about (Score:3, Informative)
Java is much more "C++ with some warts removed" than an Obj-C derivative. Obj C _is_ a "dynamic, late-binding programming environment." C++ and Java are not.
Self is no more a son of Smalltalk than Java is a son of Obj C. They (Self and Smalltalk) both came out of PARC, but they are very different.
I suspect you have no more idea about what went on at Apple than you do about programming languages, but I can't speak to that myself.
Objective-C is not even close to Smalltalk (Score:2)
Yes, Objective-C has late binding and limited dynamic typing, but that's only a tiny part of what Smalltalk offered in 1980.
The sad fact is that it is Apple, more than any other company, that is responsible for the bloated toolkits and libraries that we see today: they set the pattern with the Lisa and the Macintosh toolboxes, and everybody else copied it, up to and including Java and C#. And while Objective-C happens to copy some
Re:Objective-C is not even close to Smalltalk (Score:2)
Oh please. Probably the Apple II or the original IBM PC was the last micro where you could count on knowing the ins and outs of every byte running on the machine. And even those had BIOSes.
Once you get into interactive GUIs, you don't want every single application having to reinvent Quickdraw, or a menuing system, or a windowing system, or a file system, or
Self is a Smalltalk (Score:2)
http://research.sun.com/self/papers/smalltalk.pdf [sun.com]
I don't remember who said "Self is like Smalltalk, only more so" but that is a great definition. To avoid having to repeat this discussion all the time I have renamed my Self/R project as Neo Smalltalk.
Does Patrick Naughton Have No Idea Too? (Score:3, Interesting)
No, you don't get it. (Score:3, Informative)
Personally, I feel that GUIs should be mostly declarative, such that one stores descriptions and attributes of windows and widgets rather than using boatloads of "new Window(...)" and "new Widget(...)" commands in code. Events are then bound to a programming language of choice. Declarative approaches are usually easier to adapt
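The declarative approach being advocated can be sketched as data plus a small interpreter; all names here (SPEC, build, the handler map) are invented for illustration, not any real toolkit's API:

```python
# A window described as data rather than built with new Window()/new Widget() calls.
SPEC = {
    "type": "window",
    "title": "Example",
    "children": [
        {"type": "button", "id": "ok", "label": "OK"},
        {"type": "label", "id": "msg", "text": "ready"},
    ],
}

def build(spec, handlers):
    """Walk the declarative description, produce widget records,
    and bind event handlers by widget id."""
    widget = {k: v for k, v in spec.items() if k != "children"}
    widget["on_click"] = handlers.get(spec.get("id"))
    widget["children"] = [build(c, handlers) for c in spec.get("children", [])]
    return widget

ui = build(SPEC, {"ok": lambda: print("clicked")})
print(ui["children"][0]["label"])  # OK
```

The point is that the UI structure lives in data, so tools can inspect or edit it without running any widget-construction code.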
Re:Jobs didn't get it. (Score:2)
I'm sure that, philosophically, you're correct (and I'm not going to really debate it, because half of what you said is gibberish to me), but you have to remember that Apple is a corporation... they couldn't spend millions of dollars and years and years of R&D chasing some lofty goal of computing; they had to get a product out the door
Re:spot on (Score:2)
Wrong.
/usr/share/file/magic:
Java Was Strongly Influenced by Objective-C [virtualschool.edu]
NeXT's mach-o & java even share the same magic number:
# mach file description
#
# Since Java bytecode and Mach-O fat-files have the same magic number, the test
# must be performed in the same "magic" sequence to get both right. The long
# at offset 4 in a fat file tells the number of architectures. The short at
# offset 4 in a
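The shared magic number is easy to demonstrate; a minimal sketch, assuming only that both formats begin with big-endian 0xCAFEBABE (the headers below are hand-built for illustration, not read from real files):

```python
import struct

CAFEBABE = 0xCAFEBABE

def looks_like_cafebabe(header: bytes) -> bool:
    """Both Java .class files and Mach-O fat binaries start with the
    big-endian magic 0xCAFEBABE; the first four bytes alone cannot
    tell them apart, hence the ordering note in the magic file above."""
    if len(header) < 4:
        return False
    (magic,) = struct.unpack(">I", header[:4])
    return magic == CAFEBABE

# A Java class file header: magic, then minor/major version numbers.
java_header = struct.pack(">IHH", CAFEBABE, 0, 52)
# A fat Mach-O header: magic, then the architecture count at offset 4.
fat_header = struct.pack(">II", CAFEBABE, 2)

print(looks_like_cafebabe(java_header), looks_like_cafebabe(fat_header))  # True True
```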
Re:spot on (Score:2)
It doesn't matter who Naughton's muse was, Java's object model is close to C++, and very different from Objective-C.
Article is hostile : mc68000 was 32 bit! not 16! (Score:3, Interesting)
The DATA bus used 16 wires... but it was a goddamned 32-bit chip, and this fact used to piss off Intel x86 people for many years.
So much so that they try to rewrite history with articles like this crap that ignore that the chip was 32 bits.
A 64-bit processor, for example, typically DOES NOT have 64 data bus lines to the actual motherboard RAM, and certainly NEVER EVER offers all 64 bits of addressing (possibly some offer 48 in this universe, though).
But does that mean a 64-bit chip is not 64-bit? No!! Just as the 68K was a genuine 32-bit chip, and almost no effort was needed when a fully 32-bit-wired version was offered for sale.
The article is hostile to history of the mac and lisa.
By the way, I bought both the years they shipped.
Re:Article is hostile : mc68000 was 32 bit! not 16 (Score:5, Informative)
68000: 32-bit registers, 24-bit address bus (linear addressing), 16-bit data bus
8088: 16-bit registers, 20-bit address bus (segmented addressing), 8-bit data bus
I frankly don't consider the 8088 and 68000 even remotely comparable - it's far easier to program for (and design hardware around, IMHO) the 68K. The only difficulties that I knew of anyone really experiencing when moving to the 68020 and other full 32-bit variants was that people had gotten into the really bad habit of using the upper 8 bits of the A registers for general storage, which would break things on a '020 horribly. Even so, it was certainly nothing like the EMS/XMS hell that PC programmers had to go through just to use memory above 1MB because of the limitations of the 8088 memory architecture.
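The "32-bit clean" breakage described above can be illustrated with plain integer arithmetic (a sketch, not real 68K code): flags hidden in the unused top byte of an address are harmless while only 24 address lines are driven, and catastrophic once all 32 bits reach the bus.

```python
ADDR_24_MASK = 0x00FFFFFF  # the 68000 drove only 24 address lines

def tag_high_byte(addr: int, flags: int) -> int:
    """The bad habit: stash flags in bits 24-31 of an address."""
    return (flags << 24) | (addr & ADDR_24_MASK)

tagged = tag_high_byte(0x00123456, 0x80)

# On a 68000 the high byte never reaches the bus, so the tagged value
# still resolves to the same location:
assert tagged & ADDR_24_MASK == 0x00123456

# On a 68020 all 32 bits are significant, so the tagged value is simply
# a different address -- the breakage described above:
assert tagged != 0x00123456
print(hex(tagged))  # 0x80123456
```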
Re:Article is hostile : mc68000 was 32 bit! not 16 (Score:2)
No, it makes the rather confusing assertion that:
1) Apple decided that a 16-bit machine was necessary for what they were trying to accomplish.
2) Apple considered many CPUs for the LISA, including the 8088.
So, you see, the article doesn't state that the 8088 is a 16-bit CPU, it just states that Apple considered it despite earl
80386 better than 68000. (Score:4, Informative)
I like the 68000 because it has so many registers, but I think, all in all, the 80386 is the better CPU.
For reference, consider:
http://www.freescale.com/files/32bit/doc/reports_
http://www.df.lth.se/~john_e/gems/gem0028.html [df.lth.se]
http://linux.cis.monroeccc.edu/~paulrsm/doc/trick
http://www.csua.berkeley.edu/~muchandr/m68k [berkeley.edu]
Right off the bat, we notice that the 68000 did not support 32-bit multiplication at all. Doesn't sound too much like a 32-bit chip to me. Compare that to Intel's quirky IMUL, which I believe puts the result into EDX:EAX to get a real 64-bit result.
Integer math was faster clock for clock on the 386. Compare things like 68K register addition to Intel register addition. There's no comparison.
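The missing 32-bit multiply had to be synthesized from the 68000's 16x16->32 MULU; this is the textbook decomposition, sketched in Python rather than assembly:

```python
def mul32_from_16(a: int, b: int) -> int:
    """Build a 32x32 -> 64-bit product out of four 16-bit multiplies,
    the way code on a 68000 (whose MULU is 16x16 -> 32) had to:
    (a_hi*2^16 + a_lo) * (b_hi*2^16 + b_lo)."""
    a_hi, a_lo = a >> 16, a & 0xFFFF
    b_hi, b_lo = b >> 16, b & 0xFFFF
    return ((a_hi * b_hi) << 32) + ((a_hi * b_lo) << 16) \
         + ((a_lo * b_hi) << 16) + (a_lo * b_lo)

assert mul32_from_16(0xDEADBEEF, 0x12345678) == 0xDEADBEEF * 0x12345678
```

Four multiplies plus shifts and adds for every full-width product is exactly the kind of clock-for-clock penalty being argued about here.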
Compare
http://www.gamedev.net/reference/articles/article
to
http://www.df.lth.se/~john_e/gems/gem0028.html [df.lth.se]
Whenever you did any 32 bit pointer math on a 68k, you paid a huge, huge performance penalty. It was always more efficient to do things in 16 bit PC relative addressing.
The 68K had no concept of isolated memory or tasks. So systems like the Amiga and the Macintosh would run without any isolation between processes. I was an Amiga fan boy and I used to get that GURU meditation error so much that it was not even comical.
The tragedy of the 386 architecture was actually Microsoft and not Intel. DOS and Windows did not use even the 386 chip to its fullest capability for memory management. MS users would have to wait until Sept 1995, almost 10 years after the 386, for a true 32 bit operating system.
Re:80386 better than 68000. (Score:2)
Re:80386 better than 68000. (Score:5, Informative)
The 68000 came in 1979. The i386 was first introduced in late 1985.
The 68020, however, introduced in 1984, did support 32x32->64-bit multiplication and division between all data registers. An external MMU was available, but it was unused by MacOS and AmigaOS.
And the 68000 has had nice relative addressing modes from the start. I don't understand what you are referring to. (I have written machine code for all of these.)
mod parent up (Score:3, Informative)
By the time the 80386 came out, Motorola had 68020s and perhaps 68030s.
PS: the 68020s and I think the 68000s could run Unix because of built-in memory protection and other features. Could 8088s, 8086s, 80186s or 286s do that? No, I do not consider early SCO XENIX aka OpenServer a real Unix with built-in memory protecti
Re:80386 better than 68000. (Score:2)
Apple Macintosh was introduced in 1984
Commodore Amiga was introduced in 1985
Compaq 386 introduced in 1986
Re:80386 better than 68000. (Score:2)
Even with 80386 though, PCs were inferior to Macs, because there was no proper 32-bit protected-mode O/S for 80386 (unless you count expensive Unix).
Let's not forget that the 68000 was the CPU used in most coin-ops...the most impressive of which was Outrun and other SEGA super-scaler games. All the SEGA machines used two 68000 plus a group of custom chips to do the graphics job...the 68000 was particularly co-operative with co-proc
Re:80386 better than 68000. (Score:2)
Firstly, there was OS/2 and Windows NT or more esoteric choices like NeXTSTEP. Not to mention that as of Windows/386 (ca. 1988) the OS was becoming "32 bit".
Secondly, MacOS didn't have protected memory at all (until OS X, ca. 2000) and wasn't "32 bit" until System 7, in 1991 (and then only on "32 bit clean" Mac hardware).
In technical terms, DOS-based Window
Re:80386 better than 68000. (Score:2)
Given that NT wasn't introduced until almost 10 years after the Mac, and only a year before Apple went to the PPC architecture, I don't think it really can be considered here. OS/2, while a true pre-emptive multitasking system (and a damn good one IMHO), didn't come out until about the time the second-generation Macs did (1987), and didn't even hav
Re:80386 better than 68000. (Score:2)
The comment I was replying to was talking about the 386, and PCs being "inferior" because "there was no proper 32-bit protec
Re:80386 better than 68000. (Score:2)
Firstly, there was OS/2 and Windows NT or more esoteric choices like NeXTSTEP. Not to mention that as of Windows/386 (ca. 1988) the OS was becoming "32 bit".
OS/2 was in the design stage when the 80386 came out (in 1986). Windows NT was only a thought in the back of Microsoft's mind. NeXTSTEP did not run on the 80386. There was no fully 32-bit Windows/386 product then.
Secondly, MacOS didn't have protected memory at all (until OS X, ca. 2000) and wasn't "32 bit" until System 7, in 1991 (and then only on "3
Opteron better than 80386 (Score:2)
Re:80386 better than 68000. (Score:2)
This is not correct. Their first choice would have been OS/2, followed closely by Windows NT. Not to mention that as of Windows/386 (ca. 1988), DOS-based Windows was starting to become "32 bit" (was using >640k RAM, could pre-emptively multitask DOS boxes, etc).
DOS-based Windows was always an evolving product. To say "Microsoft users" didn't get any 32-bit goodness until Windows 95 is simpl
Re:80386 better than 68000. (Score:2)
Re:Article is hostile : mc68000 was 32 bit! not 16 (Score:2)
(Emacs, BTW, was a major early offender, because the default assumption it made was that you could use part of every pointer for flags. I think it still does this, rather than defaulting to separate pointers and flag bits except on platforms where the assumption has been tested.)
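The pointer-flags habit mentioned here can be sketched; the safer variant (used by many Lisps today) steals the *low* bits freed by alignment rather than high address bits, since high bits stop being spare as address spaces grow. Names and tag layout are illustrative, not Emacs's actual scheme:

```python
TAG_BITS = 3                 # 8-byte alignment leaves the low 3 bits free
TAG_MASK = (1 << TAG_BITS) - 1

def tag(ptr: int, t: int) -> int:
    """Pack a small type tag into the alignment bits of an aligned address.
    Unlike high-bit flags, this survives any growth in the address space."""
    assert ptr & TAG_MASK == 0, "pointer must be 8-byte aligned"
    assert 0 <= t <= TAG_MASK, "tag must fit in the alignment bits"
    return ptr | t

def untag(word: int):
    """Recover (pointer, tag) from a tagged word."""
    return word & ~TAG_MASK, word & TAG_MASK

word = tag(0x1000, 0b101)
print(untag(word))  # (4096, 5)
```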
LISA (Score:3, Interesting)
but it was also always breaking and needing service, and it didn't get a lot of use..
Lisa was like taking home an attractive woman (Score:2, Funny)
Re:Lisa was like taking home an attractive woman (Score:2)
I disagree with the conclusion (Score:5, Informative)
One of the most maddening things about programming the Lisa was that you couldn't make programs that integrated well with the Lisa office suite. Why? Because there was no API for the GUI. None. If you wanted a window drawn, you fired up QuickDraw and drew it yourself. Want a scroll bar? Do it yourself. Menus? Right.
I ended up only using the development environment's console for my programs' interfaces. The development environment was also console based, probably for the same reasons. A couple of years later, Apple released the Lisa Toolkit that had all that stuff, after they had announced they were going to discontinue it.
So in my opinion, it was the lack of software that killed the Lisa, not its high price. I mean, people were paying for it, and they wanted more. The ability to use proportional fonts was the killer feature to end all killer features.
It's worth noting that Apple learned its lesson about making developers happy - the developer support program for the Macintosh has been one of the best.
Re:I disagree with the conclusion (Score:2)
I was just reading an article that took a different view with respect to video and video hardware, claiming that Apple provides NO support for a third party to make their own video card drivers.
Re:I disagree with the conclusion (Score:2)
Sure, Apple had it. Didn't do much good for the rest of the third-party developers.
Comprehensive Lisa info at guidebookgallery.org (Score:4, Interesting)
Re:Comprehensive Lisa info at guidebookgallery.org (Score:2)
commentary is off-base (Score:4, Informative)
Claims that the Lisa represented significant technological innovation seem dubious to me. You need to compare the Lisa to the totality of R&D efforts around at the time, not just the Star. Xerox alone had Alto, Star, Smalltalk, and probably others. The GUI of the Lisa was an evolutionary change, and not always for the better; what was under the hood of the Lisa can charitably be described as pedestrian. It took Apple 20 years to catch up and finally adopt system software that even is in the same league as Smalltalk-80 (that's "80" as in "1980"; Smalltalk-80 is the language and platform that Objective-C and Cocoa are modeled on).
Lisa's main significance was to be a prototype for, and cannibalized for, Macintosh (and it served as the main development machine for Macintosh apps for a while), but I can't think of any significant new technology it introduced.
Re:commentary is off-base (Score:3, Insightful)
Alto was, from ev
Re:commentary is off-base (Score:2)
I think you underestimate how slow the Lisa was; the GUI was of no importance when waiting for it to do anything was an exercise in frustration.
TWW
Re:commentary is off-base (Score:2, Informative)
Ummm.... menus?
What is an "object oriented UI"? (Score:4, Funny)
Re:What is an "object oriented UI"? (Score:2)
Dan Smith, a major contributor to the desktop interface, had created the Lisa's first interface, the Filer. The Filer asked a user a series of questions about what task the user wanted to accomplish, and when enough questions were answered, the tasks were executed and the Filer shrank to the background.
Re:What is an "object oriented UI"? (Score:3, Informative)
OS/2's Workplace Shell is generally considered the best example.
Remember, the original Mac didn't sell well either (Score:5, Informative)
The lack of a hard drive was the killer. By the time the Mac came out, IBM PCs had a hard drive, so Apple was playing catch-up. Apple had tried building hard drives (the LisaFile), but they were slow and crashed frequently. But at least the Lisa had a hard drive. Third parties added a 10MB hard drive to the Mac [atarimagazines.com] in early 1985, which brought performance up to an acceptable level. Some people say that third-party hard drives saved the Mac. But Apple fought them tooth and nail. Apple finally came out with a 20MB external hard drive for the Mac in 1986. This was very late; IBM PCs had been shipping with hard drives for five years.
Sales for the Mac were well below expectations. Apple had been outselling IBM in the Apple II era. (Yes, Apple was once #1 in personal computers.) In the Mac era, Apple's market share dropped well below that of IBM.
What really saved the Mac was the LaserWriter, which launched the "desktop publishing" era. But that required a "Fat Mac" with a hard drive and 512K. By then, the Mac had reached parity with the Lisa specs, except that the Lisa had an MMU and the Mac didn't. The Lisa also had a real operating system, with protected-mode processes; the Mac had "co-operative multitasking" in a single address space, which was basically a DOS-like system with hacks to handle multiple pseudo-threads.
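The difference being described can be caricatured with generators: under cooperative multitasking, a task runs until it volunteers to yield, so one misbehaving task stalls everything (an analogy only, not the actual Mac scheduler):

```python
def task(name, steps):
    """A cooperative task: it runs until it *chooses* to yield.
    A task that never yields would hang every other task -- the
    classic single-address-space Mac failure mode."""
    for i in range(steps):
        yield f"{name}:{i}"

def run(tasks):
    """Round-robin over tasks; nothing can preempt a running task."""
    trace = []
    while tasks:
        t = tasks.pop(0)
        try:
            trace.append(next(t))
            tasks.append(t)           # task yielded; requeue it
        except StopIteration:
            pass                      # task finished; drop it
    return trace

print(run([task("A", 2), task("B", 2)]))  # ['A:0', 'B:0', 'A:1', 'B:1']
```

A protected-mode OS like the Lisa's, by contrast, can interrupt a task at any point and keeps each one in its own address space, so a crash or a busy loop in one process doesn't take down the rest.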
The MMU issue was actually Motorola's fault. The 68000 couldn't do page faults right, and Motorola's first MMU, the Motorola 68451, was a terrible design. The Lisa had an Apple-built MMU made out of register-level parts, which pushed the price up.
Apple might have been more successful if they'd just stayed with the Lisa and brought the cost down as the parts cost decreased. They would have had to push Motorola to fix the MMU problem, but as the biggest 68000 customer, they could have.
The Mac's other salvation: square pixels (Score:5, Informative)
Re: (Score:2)
Re:The Mac's other salvation: square pixels (Score:3, Interesting)
Some corrections (Score:3, Informative)
Internal hard drives didn't come 'til, I want to say Mac II? Was there one for the SE?
Re:Some corrections (Score:2)
The SE/30 had an internal hard drive, but was produced after the Mac II and was really just a Mac II in an SE case.
Re:Some corrections (Score:2)
There was a Mac SE FDHD [everymac.com], but that wasn't until 1989. The Macintosh Plus [everymac.com], released in 1986, was the first Mac with a SCSI port, and Apple sold a matching hard drive. That was the first Mac with a supported hard drive. There had been previous third-party attempts to add a hard drive, but they either required internal mods or worked, slowly, through the printer port.
Multifinder originally was an add-on for System 4. I
document centric... (Score:2)
Today there's too much focus on what apps one uses, not what one wants to do.
I all too often see people with Photoshop installed when all they do is look at digital photos...
Re:document centric... (Score:2)
For example, I open PowerPoint and I get a Window with a bunch of document templates on it. If I select Excel Workbook, Excel opens up with a blank workbook and PowerPoint goes into the background.
On the Lisa, you have a folder called Stationery that has blank copies of documents. You just double-click on one and the appl
Re:document centric... (Score:2)
Back ? UIs have been moving more and more away from being application-driven and towards being document-driven for the last 25+ years. Methinks you never really used computers before the 90s :).
Today there's too much focus on what apps one uses, not what one wants to do.
I think you've got that backwards. Interfaces today are *very* much document-centric, not application-centric.
That many users treat them mostly as application-centric, is no
Another good site is (Score:2, Informative)
Another good site full of first-hand descriptions of how early Apple development was done is http://folklore.org/ [folklore.org].
I've never owned a Mac, and am too young to have been involved in earlier developments - but that site does make it all seem very impressive.
Interesting ad (Score:2)
http://www.jagshouse.com/images/lisa4.jpg [jagshouse.com]
The real-world images above the icons look like the photorealistic icons used in OS X (no shit, yes they're real photos, but note the lighting, camera angle, etc.)
Re:I for one welcome... (Score:2, Interesting)
Re:I for one welcome... (Score:3, Insightful)
Actually, he's just some damn skript kiddie. They ALL talk like that.
Pathetic, actually.
Re:I for one welcome... (Score:2, Funny)
Re:Oh Please (Score:4, Insightful)
Re:Oh Please (Score:2)
Re:Oh Please (Score:4, Insightful)
Ah yes, the Lisa shows paths we did follow as well as some we didn't. The whole idea of centering document creation on templates at the GUI level is very interesting and warrants further investigation. I hope the Gnome and OpenOffice folks think about it.
Re:Oh Please (Score:2)
Besides, Word and many other applications allow you to save templates, and not all applications fit the template model anyway.
Re:Oh Please (Score:2, Funny)
Re:You are a Moron (Score:5, Interesting)
About a year after the formal release of the Mac, I called up Apple's service people looking for a replacement gate array chip for an Apple
And the parent poster isn't too far off base in one respect. Apple's market share did drop, not so much because of the move from DOS3.3/ProDOS to the Mac, but because of the way Apple treated their existing base of Apple ][ and
Never forget one thing: Apple was the incumbent, with all the advantages that confers. Atari and Commodore together couldn't market themselves out of a cardboard box and both eventually fell by the wayside. I look back at Apple Computer as a company that was at the right place at the right time with everything in its favor, only to squander opportunity after opportunity, relegating itself to second place. And such a distant second as to be almost out of the running, when they could have owned the market. They do seem to be making some good moves lately: let's see if they can keep it up.
Re:You are a Moron (Score:3, Informative)
Actually, the original PC BIOS wasn't so much reverse-engineered, as it was simply duplicated. IBM published the full annotated assembler listing in the original IBM PC technical manual.
Yes, that was done purposely in the hopes that its existence could be used to quickly shut down any cloners via copyright infringement lawsuit. The grandpare
The Apple II: What might have been? (Score:2)
Never forget one thing: Apple was
Re:The Apple II: What might have been? (Score:4, Insightful)
Re:You are a Moron (Score:2)
Re:You are a Moron (Score:2)
Re:You are a Moron (Score:2)
Re:Where are they now? (Score:2)