This discussion has been archived. No new comments can be posted.

GUI Revolutions: From Flashing Bulbs To Windows 8

  • Is that reality is never as good as possibility, because any idea will end up being moulded for the personal gain of a particular business or government. Whether it's lock-in on the desktop or sending your information off to the cloud, we'll never see a decent peer-to-peer collaborative system as long as humans are designing, building, deploying and maintaining it.

    • we'll never see a decent peer-to-peer collaborative system as long as humans are designing, building, deploying and maintaining it.

      That's why, come singularity, robots will win. ~

    • by BitZtream (692029)

      I thought one of the battle cries of RMS was that OSS fixes all those problems ... No one's personal agenda matters because everyone can make it like THEY want it!

      As long as greedy people are designing, building and deploying it, yes, it'll be the way it is.

      We've seen in the past some alternatives done by people without greed on their mind (at least in the beginning) which then promptly get offered massive sums of money that only an idiot would turn down in exchange for selling out to some big company ... which then turns it into exactly what you say.

      • You've combined some reasonable observations with some unsound conclusions.

        We've seen in the past some alternatives done by people without greed on their mind (at least in the beginning) which then promptly get offered massive sums of money that only an idiot would turn down in exchange for selling out to some big company ... which then turns it into exactly what you say.

        Yes, except that there is the occasional "idiot" who doesn't sell out. They are the people you don't hear about. By keeping the world talking about the big winners (by some moulded definition of winner), you keep people chasing the dragon. Unfortunately, the guy who remains true to himself tends to get obsoleted by an inferior copy with better marketing, so either way you're fucked :-).

        Anarchy doesn't work efficiently enough to be useful beyond a small size. I challenge you to show me an example of it working.

        Trying to engineer a system where power remains a

      • by mattack2 (1165421)

        Wikipedia is more or less a failure. It's not useless, but no intelligent person goes to Wikipedia to get facts, since it's basically in constant anarchy at this point. You might go to get an idea of what something is, but that's where it ends.

        A failure? I think it's a huge success. "Getting an idea of what something is" is a great resource. Do I trust it absolutely? No, but I do at least sometimes (when I'm trying to get more than a basic idea about something) follow the citations, or do more research sep

      • Wikipedia is not working? This is so 2005. Have you been there lately? Wikipedia works very well indeed, because it's reached a critical mass of "social seriousness" that experts want to partake in. What keeps us from going extinct is our diversity? That is so 50 million years BC. What keeps us from "going extinct" is that we haven't unleashed nukes and we haven't destroyed the environment, yet. If either of those two things happens, then it's not going to help that we're "diverse". Everyone gains from the scien
  • There were many. (Score:5, Insightful)

    by bmo (77928) on Thursday June 09, 2011 @09:34AM (#36387154)

    This article doesn't scratch the surface, and looks more like an advertisement for Windows 7 and 8.

    http://web.archive.org/web/20100101033213/http://toastytech.com/guis/index.html [archive.org]

    There's history for ya.

    Wayback Machine mirror so as to not nuke the poor guy's site.

    --
    BMO

    • by Fri13 (963421)

      +1

      To sum it all up:

      "The mother of all demos"
      Xerox Star
      Windows 1.0
      Microsoft Bob
      Windows 95
      Windows XP
      Windows 7
      Windows 8
      ...
      The future from Microsoft?

      Conclusion: Cheap ad for that site.

    • The title alone makes it look like an advertisement for Windows - colour me unsurprised :) If it had mentioned an OS that was already out then it would have been less of a clue.

      Everything I've heard about Windows 8 so far makes it sound like it's very touch-screen centric. I think they're missing the point completely if they're abandoning traditional desktop paradigms altogether. By all means make a version of Windows that is designed for tablets, but don't force that UI on everyone else. It'll end up being worse than Unity.

      • It'll end up being worse than Unity.

        Wait - that's possible?

        (.../me gets shown image of WP 7 Metro UI...)

        Urgh. I take it all back.

        You do bring up a good point, though.

        A one-size-fits-all UI is like trying to find a nubile cutie fresh out of college who can calculate quaternion rotations in her head, thinks emacs sucks, wants to marry a typical slashdotter and have his babies, but at the same time loves hunting, fishing, and, oh BTW - she's a billionaire.

        In other words? Not going to frickin' happen. Too many damned use cases out there to credi

        • Believe it or not, I'm dating a girl who can do most of those things, but is not a billionaire. And in full disclosure has been out of college for two years and had to be converted to the vi camp by yours truly.
        • (.../me gets shown image of WP 7 Metro UI...). Urgh. I take it all back.

          If you haven't seen it in action, you should understand that images don't do it justice. It looks different enough from conventional icon grid iOS/Android/whatever UI that it's not obvious how this thing is supposed to work, but actually trying to use it usually drives the point home.

          There are plenty of things wrong with WP7 at the moment (lack of tethering is a killer right there, and we didn't even get to apps...), and I'll take my Android phone over it any day; but UI isn't one of those things...

        • by sgt scrub (869860)

          Not going to frickin' happen.

          Destroyer of dreams!

        • by BitZtream (692029)

          A one-size-fits-all UI is like trying to find a nubile cutie fresh out of college who can calculate quaternion rotations in her head, thinks emacs sucks, wants to marry a typical slashdotter and have his babies, but at the same time loves hunting, fishing, and, oh BTW - she's a billionaire.

          Turns out, you just described my wife almost to the letter, though she isn't a billionaire and we didn't find out until after we were married that she liked hunting and fishing. Had a deal her father was working on not fallen through, it's quite probable she would have been a billionaire too! So it's not impossible, but it certainly is highly improbable.

    • by cpu6502 (1960974)

      >>>looks more like an advertisement for Windows

      Yep.
      According to this history, nobody existed in the personal computer market (i.e. home) except for Apple and Microsoft. Other significant companies like TI, Atari, and Commodore did not matter. I mean... Atari merely created the idea of a multimedia computer (one that has music-quality sound and graphics) in 1979. Commodore merely invented the idea of preemptive-multitasking and parallel processing (between SPU, GPU, and CPU).

      But they don't matter

      • by wootcat (1151911)

        To be fair, the article's purpose was to focus on the development of GUI. Multimedia and preemptive-multitasking don't really fall under that category. But you are right in that the article doesn't cover other important GUI advancements, such as Amiga's contributions or even application-level improvements such as the ability to select a block of text and drag it to another location of the document, dynamically shifting the text as the block is moved.

        The article does get points for even mentioning GEM, but i

      • Re:There were many. (Score:4, Informative)

        by starfishsystems (834319) on Thursday June 09, 2011 @01:20PM (#36390540) Homepage
        There's also a total neglect of the X Window System development at MIT, not to mention the various Lisp Machines and their graphical user interfaces which, drawing on the truly foundational work conducted at PARC and elsewhere, further explored the GUI paradigm and established some of its practical limitations.

        The importance of building practical systems to test principles of human-computer interaction cannot be overemphasized. The early work by Doug Engelbart, Alan Kay and others was both innovative and empirical, but it dealt with various components of the GUI in isolation. Only by building a complete GUI system and putting it in front of a lot of people could we learn which elements were most successful and in what combinations.

        For example, one of the ideas being particularly explored in the Lisp community at this time was how and to what extent the underlying objects should be manipulable through the GUI. Graphical copy-and-paste was a new but easily accepted idea. The obvious question, then, was whether such operations would do better to copy a representation of the object or the object itself. This parallelled a similar debate about the design of Lisp editors: whether these should be text editors in the spirit of Emacs or object editors which happened to offer a text representation. If I copy and paste a graphical representation of a file on the screen, under what conditions should that copy the file contents, the file itself, a link to the file, or the name of the file?

        The answer, if you were to ask Microsoft or Apple at that time, would be equivalent to Henry Ford's "You can have any color you want as long as it's black." The Unix and Lisp world, meanwhile, were much more exploratory. No huge revelations come to mind, but in an incremental way it was these communities which established many of the GUI conventions we take for granted today. What has followed thereafter, for the most part, is merely eye candy.
      • by mattack2 (1165421)

        The truth is that Apple/Microsoft computers of the 1980s were bland and uninteresting (unless you enjoyed 4-color graphics and sound that went "beep")

        BUZZ. Ensoniq chip built into every Apple IIGS that I believe is the same chip that PC users were paying lots of money for on a Soundblaster card.

        Also, much more than 4 color graphics.

    • by Hatta (162192)

      Coral Cache is a much better option for that kind of thing. The Wayback Machine is designed to preserve history, not buffer peak bandwidth. Coral is faster and will be more up to date than the Archive.

    • by sconeu (64226)

      Not to mention that the submitter is blogwhoring himself.

  • by WrongSizeGlass (838941) on Thursday June 09, 2011 @09:34AM (#36387166)
    I don't see mobile GUI's in there. Surely Palm's early offerings qualify as a "before" and iOS & Android as an "after" ... and Magic Cap as in "in between". Desktops aren't the only place we use GUI's.
    • by drinkypoo (153816)

      The article is garbage at best and a paid advertisement at worst. When there is an article on the development of GUIs that doesn't include the word "Motif" you know it is shit. Microsoft being on the Motif WG explains so much about Unix user interfaces...

  • "What I saw in the Xerox PARC technology was the caveman interface, you point and you grunt. A massive winding down, regressing away from language, in order to address the technological nervousness of the user".
    • by operagost (62405)
      Vi rules, eh?
      • Vi rules, eh?

        vi has nothing on ed.

        user@host:~$ ed

        ?
        ^C
        ?
        quit
        ?
        q
        user@host:~$

        Now there's a real man's editor.
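          In fairness, ed is scriptable in a way no GUI editor is. A minimal non-interactive session (the file name here is made up for illustration) might look like:

```shell
# Seed a file, then drive ed non-interactively via a heredoc.
# -s suppresses the byte-count chatter seen in the interactive session above.
printf 'hello\n' > note.txt
ed -s note.txt <<'EOF'
a
world
.
w
q
EOF
# note.txt now contains "hello" and "world" on separate lines.
cat note.txt
```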

      • by Arker (91948)
        Slashcode is utterly abominable. Every revision, they not only make the place uglier, they introduce new bugs as well. The last redesign appears to have totally borked utf input.
  • Spellcheck (Score:1, Funny)

    by SJHillman (1966756)
    "Douglas Englebart was a true visionary. On a single conference on Deceber 9, 1968, he performed a live demonstration that showed working prototypes of a computer mouse, hypertext, email, word processor, and collaborative real-time editor." However, you will note the lack of a working spellcheck. Or else Deceber is a month that only existed in the 60's.
  • by Jeremy Erwin (2054) on Thursday June 09, 2011 @09:46AM (#36387302) Journal

    Our brain is well suited to work with visual clues, and computers soon learned to use that.

    What will computers think up next?

    • 'Poorly written' is exactly what I was thinking. There are so many errors it would be a waste of time to point them out.
  • by sakdoctor (1087155) on Thursday June 09, 2011 @09:48AM (#36387334) Homepage

    In future, all screens will be touchscreen, even your main PC monitor. The major breakthrough was the self-lubricating touchscreen. It's naturally oily, and hypo-allergenic, requiring no cleaning.

    Of course, the mouse-driven paradigm needed to be scrapped completely, in favour of an adult finger-painting gesture system. Mod someone down on slashdot? There's a gesture for that. There's an intuitive gesture for absolutely everything. Just install the gesture localization pack.

    True, I can't find any of my LOCAL applications any more, but that's fine because I can just google for them, and they'll turn up some place.
    It's going to be a good future.

    • What does the future hold?

      It better not be fucking useless, gimmicky hand gestures

    • by bob8766 (1075053)

      Mod someone down on slashdot? There's a gesture for that.

      And the best part is that it's intuitive and universal. I use it a lot when I'm driving

    • by Swiper (1336263)
      Could you explain what you mean by "Local" please? It seems to belong to a long-disused set of words, also featuring such dinosaurs as "private data"....
  • That's almost hysterical. If anything, windows 3.1 was revolutionary but that's only in the Microsoft context.

    • by drinkypoo (153816)

      Windows 3.0 was evolutionary, and it evolved in parallel with Unix GUIs since Microsoft was involved with Motif. This is why for many years you could sit down at either a Windows machine or a typical Unix machine, whether it came from IBM, Sun, or some other source, and apprehend the basic windowing functions.

      The next GUI revolution is in reality overlay. Ideally you all but eliminate any interface but pupil tracking, voice, and gesture. A small device (like a cellphone) has enough interface surface. Gestur

      • you could sit down ... and apprehend the basic windowing functions.

        Is that like in the movie Tron? "You're under arrest, basic windowing function."

    • by Zediker (885207)
      It's an evolution, but only time will tell if it is a revolution... I'm not particularly betting on it being one.
    • by Anonymous Coward

      Try "from flashing bulbs to iOS", or "from flashing bulbs to Android". If you're searching for the modern pinnacle of GUI evolution, the desktop GUI ain't it (especially windows, released or unreleased). The desktop GUI was perfected 10 years ago, and nearly every "improvement" since then has been driven by the developer's vision rather than the user's need.

  • One of the IBM technicians wrote about Lisa's OS: "What I saw in the Xerox PARC technology was the caveman interface, you point and you grunt. A massive winding down, regressing away from language, in order to address the technological nervousness of the user".

    What was true then is true today. No GUI comes close to matching the expressive power of the command line. GUIs are still a silly prop for kids.

    • by gstoddart (321705)

      What was true then is true today. No GUI comes close to matching the expressive power of the command line. GUIs are still a silly prop for kids.

      It depends on what you're doing, really. For a lot of tasks, I actually find a GUI to be well suited.

      However, I've also copied files from a Windows machine to a UNIX machine as recently as last week so I could do a little command line grep/cut/sed magic on them and produce something else. For cajoling text into a new form, a command line is still the best thing ev
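      That kind of pipeline might look something like the sketch below; the CSV layout and file name are invented for illustration:

```shell
# Hypothetical input: one account per line as name,role,status.
printf 'alice,admin,active\nbob,user,inactive\ncarol,user,active\n' > users.csv

# Keep only active accounts, pull out the name field, reshape each line.
grep ',active$' users.csv | cut -d, -f1 | sed 's/^/login: /'
# → login: alice
# → login: carol
```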

    • by Anonymous Coward

      Pardon me. I didn't realize I was on your lawn ...

    • by Richy_T (111409)

      I think it's time for a hybrid next-generation command line/gui system. There's a little crossover already with x-term windows, mouse actions in terminals, "screen" and a few other things but a comprehensive top-down approach might yield real dividends.

      • by mattack2 (1165421)

        Commando in MPW put a GUI on commands, sort of.. letting you create the CLI command with the GUI. Mostly more of a pain than it was worth, but useful the first couple of times you used a new command perhaps...

        Also, I think something like OpenDoc is(/was) the closest thing to the CLI concept in the GUI world -- lots of little programs that interact with each other. It's a shame that none of the component-based systems have become popular/are still around. If I could take a GUI program that I like all but o

    • "GUI operations are essentially impossible to script. With large numbers of servers, it is impractical to use the GUI to carry out installation tasks or regular maintenance tasks."
      - David Brooks [theregister.co.uk], Microsoft
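      His point is easy to illustrate: anything you can type, you can loop over a fleet. A sketch (the hostnames and install command are placeholders, and the remote step is commented out so the loop runs standalone):

```shell
# The same unattended install repeated over many servers: the thing
# a GUI cannot do. Hostnames and the package name are hypothetical.
for host in web01 web02 db01; do
    echo "installing on $host"
    # ssh "$host" apt-get install -y some-package   # the real (placeholder) step
done
```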
      • by BitZtream (692029)

        Perhaps someone should buy him a Mac and show him AppleScript.

        Are there any native Mac OS X apps that aren't scriptable? I don't even know how you could make them unscriptable; it's kind of built into ... well, everything in OS X.

        You don't have to do anything to make your app scriptable, though you can make it easier to script by making helpers and such, but out of the box your apps are scriptable because the core runtime libraries are scriptable.

        Maybe he should take a look in his own company at VBA, whic

    • by BitZtream (692029)

      Really? I find editing raster images and viewing them from a command line to be rather shitty, but maybe you use a different editor than I?

      Perhaps you have matrix like skills so just reading hex allows you to visualize the image, I do not however, so I tend to stick with using pretty GUI editors.

      I'll be happy to bet a year's pay that I can come up with at least 50 tasks that you simply cannot, under any circumstances, do better at a command prompt than I can do in a GUI.

      Likewise, I could do the same for the

  • by Ltap (1572175) on Thursday June 09, 2011 @09:58AM (#36387458) Homepage
    Oh, so it looks like we went from blinkenlights to terminals to Windows without stopping, and any form of interface other than that is either irrelevant or obsolete. I guess they actually consider it the end-all and be-all.
  • by Junta (36770) on Thursday June 09, 2011 @10:03AM (#36387506)

    Task switching without any hint of how much further it is to the task you are actually looking for, only allowing non-overlapping windows. It's essentially Windows 1.0 on those fronts.

    Microsoft saw the iPhone achieve apparent success making a giant phone, and MS wants every desktop to be that way. Making things worse, they are ignoring the market reality and declaring WP7 the most awesome interface for phones and giant phones.

    • I love how short demos of a new feature cause people to draw so many conclusions.
    • by tgd (2822)

      A couple points:

      - One video doesn't show you how a UX works
      - A company like Microsoft will not change the UX on software used by a billion people without cold hard facts that it's better for most of them. The scale at which Microsoft invests in UX analysis and testing dwarfs what even big software companies spend on their software in general.

      It's a pretty strong statement to make, from someone who has neither used it nor done any usability testing themselves, to declare that it's a big step back.

  • Am I the only person who would actually prefer the Windows 3.1 interface to still be around today? No more "close next to maximise", a nice "desktop" that you can organise how you like, and in subfolders, without things popping up at random places on the screen, and no Start Menu / Taskbar / Quick Launch horror, and everything taking precisely as much effort to draw as absolutely necessary (no gradient title bars, horrid skins, etc.).

    There was something sweet, simple, endearing and DAMN FAST about the 3.1 shell that I haven't found anywhere since. It flew even on 200MHz machines.

    • "Am I the only person who would actually prefer the Windows 3.1 .. a nice "desktop" that you can organise how you like .. without things popping up at random places on the screen"

      A combination of Novell Netware and Xtree [xtreefanpage.org] did it for me, or even Midnight Commander [wikipedia.org]

    • by drinkypoo (153816)

      Try matchbox window manager. Or maybe fvwm with thunar.

      Making linux (or whatever) really fly is pretty easy, you just rip out everything really nice, like udev for example. It's an option if you really need the speed and can't afford a little more hardware.

      I know for my part that I'm suffering mostly for not having an SSD, which I suspect would fix everything wrong with my computing experience, from my point of view.

    • You could just run Windows 3.1. It does still work on a modern pc.

    • by Osgeld (1900440)

      lol considering 200MHz machines are about 10x faster than the machines it was designed to run on, I sure fucking hope it flew!

      Hey ma, OS/2 really flies on this 2GHZ AMD64!

    • by BitZtream (692029)

      There was something sweet, simple, endearing and DAMN FAST about the 3.1 shell that I haven't found anywhere since. It flew even on 200MHz machines.

      Considering it was written to fly on 16 MHz machines, I would hope it could do well on a 200 MHz machine.

      I haven't looked in Windows 7, but I'm pretty sure you could still use fileman.exe as your shell in XP if you really wanted to go back to the dark ages, but it's unlikely you'd actually stay that way for any length of time.

    • by narcc (412956)

      There was something sweet, simple, endearing and DAMN FAST about the 3.1 shell that I haven't found anywhere since. It flew even on 200MHz machines.

      I sure hope it was fast on a 200mhz machine! I could run Doom in a window on my 66mhz IBM Aptiva with 8mb of RAM.

      With today's hardware, computers should be unimaginably fast. There is really no excuse for the slow bloated crap we have today.

  • Because I can't wait to get fingerprints all over my monitor. . .

  • by Lord Lode (1290856) on Thursday June 09, 2011 @10:15AM (#36387666)

    An article supposed to present a huge history of GUI development, which has "Windows 8" in the title a few days after it was demoed for the first time? Sounds like the article will be something thrown hastily together to jump on the "hype" bandwagon rather than an insightful article about history...

  • A good Slashdot article would be the history of the title of this article and who got paid to create and spread it.

  • even if it is an article regarding the evolution of the windows gui, truncating the gui history from engelbart and parc to the original mac os, and then switching to the history of the windows gui is pure horseshit.

    windows is what it is today due to the development across many windowing and gui efforts.
    microsoft has (often blatantly) borrowed gui metaphors from many of its contemporaries thru several iterations of windows including:
    motif(cde) - expand/minimize/destroy window
    openlook - WIMP metaphor
    aqua - tr

    • by dubbreak (623656)
      At least they got Engelbart right, but as per usual they attribute the start to Xerox PARC, not SRI. SRI is where the mouse was invented, SRI is where scrolling windows were first done, SRI is where all the work that went into the "mother of all demos" [sri.com] occurred.

      Once the government funding that paid for the research at SRI dried up, the researchers were picked up by Xerox.
    • motif(cde) - expand/minimize/destroy window

      Given that Motif came out in 1989, a year after Windows 2.1 which had all three [wikipedia.org], this interpretation of history implies the existence of a time machine.

      Well, I guess now we know what all the billions dumped onto MS Research are actually spent on.

    • by BitZtream (692029)

      Neither Aqua nor Compiz were started before MS added those features to the OS ... no one used them because performance sucked as the OS didn't make a decent video card a requirement, and at the time, those cards were rather expensive.

      Sorry, but Compiz is nothing but copies of other peoples stuff for the 'oh shiny' factor, much like all of MS's recent stuff (WinPhone 7 and Win8) are copies of Apple's stuff, without understanding WHY they did it. Both are examples of doing it wrong, regardless of what they w

  • by sootman (158191) on Thursday June 09, 2011 @11:30AM (#36388630) Homepage Journal

    ... but wow, what a fanboyish piece of shit. There is nearly no mention of Apple after its origin.

    Leading into Windows 1 (after talking about Xerox, the Lisa, and the first Mac) he says "The era of GUI's was about to start. But apple [sic] was not meant to be the king."

    Oh really? [macdailynews.com]

    - Vista copied many features straight out of Tiger [youtube.com]
    - I think we can all agree that WP7 would not look like it does if the iPhone had never been on the scene
    - And now, after ten years of making poorly-selling tablets, Apple has shown how it should be done [paidcontent.org] and MS is falling over themselves trying to catch up

    I'm not saying Apple has never copied anything either, but once the article hits Windows 1.0, it is all about MS. He goes from Windows 3 to Microsoft Bob, lays down exactly 10 words about Windows 95, then goes straight to XP, Vista, and 7. He dismisses over two decades of Mac OS with the words "In the meantime, Mac OS was undergoing a similar, slow evolution."

    He then says "Last couple of years were really eventful. New families of computing devices became wildly popular -- smartphones, netbooks, tablets. Mobile operating systems became almost as complex and capable as desktop ones. Multi touch technologies challenged the age-old interface design, and required new approaches. And now Microsoft tells us the future belongs to tiles." and the rest of the article is about Windows 8 and tiles. REALLY? No mention at all of the iPhone, which was the first to market with multitouch, even if they didn't invent it? No mention of Palm, or WinCE or BeOS or the Amiga or a million other omissions? Come on. If he isn't a shill, he's got a BIG set of blinders on. If you want to see the history of GUIs, go here. [toastytech.com] They have a ridiculously thorough collection of screenshots.

  • Sad, a straight shot to Windows 8 - Xerox had the idea, Apple Copied it, MS copied it and then MS developed it into Windows 8 - without copying any thing else... really?

  • "GUI Revolutions: From Windows 8 to Flashing Bulbs" - there, fixed it for y'all :)

  • FTFA:

    Today we take GUI’s for granted, but back when they were starting up, some people actually saw them as a silly prop for kids. Real men were supposed to use the command line.

    Please. Real men still use the command line. That's how I browse Slashdot!

  • by Animats (122034) on Thursday June 09, 2011 @01:52PM (#36391044) Homepage

    The article re-hashes the obvious.

    There's a whole history of early graphical user interfaces from the pre-computer and early computer era.

    One of the neater ones was the Panama Canal lock control boards [zuschlogin.com], built by General Electric in 1913. This was a long desk with a symbolic model of the locks. The water level in each lock is represented by the tall indicators. The lock gate positions are represented by aluminum pointers. The protective chain lifted into position to protect the first lock gates from a runaway ship was represented by a little metal chain. The locks themselves are represented by a long strip of blue-grey stone. (The first GUI theme!) The valves are controlled by water faucets, and the gates by handles.

    All this is interlocked mechanically, so that, for example, the lock gates can't be opened unless the water levels are equal on both sides. The handles will physically not turn. That technology was borrowed from railroad signalling.

    Another system of historical interest is General Railway Signal's NX interlocking system. [rrsignalpix.com], from 1936. This is the very beginning of "user-friendly" GUIs. Previously, interlocked systems in railroad signalling, and the Panama Canal system, just prevented the operator from doing prohibited operations. NX was the first system which showed the operator all the currently valid options, let the user select one, and took care of the details of making it happen. It's well worked out. The operator selects the entrance point where a train is entering the interlocking. The system figures out all the currently valid exit points, taking into account other trains currently present, conflicting routes, etc., and lights up illuminated buttons on the track diagram for each currently allowed exit point. The operator then selects one exit point. The system then moves all the track switches as necessary, waits until they're set and locked in the correct position, then sets the signals along the route to clear. As the train passes through the interlocking, the signals change to "stop" behind it, and the track sections and switches are automatically freed up for other trains. At all times, there's at least one stopping distance of red-signaled track between any two trains, and any switch in a green-signaled section cannot be moved until the train clears it. The New York City subway system still uses this technology, along with mechanical train stop devices at every signal which, if up, will hit an air valve on each subway car and stop the train. There's a simulator [nycsubway.org] if you're interested.

    It's worth understanding the big display-board systems of the past. Many of them had better human interfaces than modern systems.

  • FTA:
    Also, the ARM compatibility is a double-edged knife: sure, Windows 8 will run on more machines, including mobile devices. But the applications themselves won't be interchangeable; software written for ARM won't work on your PC. What's the point in calling it one system, when effectively you will have two systems, each with a separate set of compatible apps?

    True on one hand (apps like Office), but not true on another. The statement was made that apps that run entirely on

  • Windows 1 had xclock?

  • What does Windows 8 have to do with the evolution of the GUI? It looks like an irrelevant, cumbersome side-branch to me.

  • On reading about this subject, I have come to appreciate the evolution that has occurred for GUI applications. I do however, believe that one should not overlook the following about me and my peers, "We baby boomers are not going to give up requiring large fonts, or keyboards for gui applications". A mouse is essential if one intends to do fine drawings such as autocad, or use draw programs or even produce powerpoint types of foils where fine line detail is needed.

      I purchased a keyboard with about 10 ext
