Alan Kay Decries the State of Computing

gnaremooz writes "Computer pioneer Alan Kay (DARPA in the '60s, PARC in the '70s, now HP Labs) declares 'The sad truth is that 20 years or so of commercialization have almost completely missed the point of what personal computing is about.' He believes that PCs should be tools for creativity and learning, and they are falling short."
  • Arrgh.. (Score:4, Interesting)

    by Defiler ( 1693 ) * on Tuesday July 13, 2004 @08:03AM (#9685058)
    Another computer visionary with vague promises and criticisms.
    Instead of doing [insert clearly-defined practical thing here], you should be doing [insert vague semi-buzzword here, like "education", or "object"] and you should be using [insert visionary's product here] to do it.
    • Re:Arrgh.. (Score:5, Insightful)

      by RevAaron ( 125240 ) <revaaron@hotmail. c o m> on Tuesday July 13, 2004 @08:11AM (#9685125) Homepage
      Another computer visionary with vague promises and criticisms.
      Instead of doing [insert clearly-defined practical thing here], you should be doing [insert vague semi-buzzword here, like "education", or "object"] and you should be using [insert visionary's product here] to do it.


      Not quite...

      While people are certainly welcome to disagree with Kay's vision, he's not in the same barrel of monkeys that most so-called visionaries and pundits live in. Unlike most of those, he's implemented his ideas, and has spent the last 30-some years implementing them- in real, live, usable code. Kay doesn't have a product; he's got nothing in a box to sell. He does have an idea to sell, though you don't pay for it with your money. He's been doing it in a very practical way for 30 years, not just making vague promises.
      • Re:Arrgh.. (Score:2, Interesting)

        by Allen Zadr ( 767458 ) *
        It's a great vision, but in a world where every single computer is expected to have a firewall - Peer-to-Peer computing -- worldwide -- isn't going to happen.
        • Re:Arrgh.. (Score:5, Insightful)

          by Zeinfeld ( 263942 ) on Tuesday July 13, 2004 @09:16AM (#9685768) Homepage
          It's a great vision, but in a world where every single computer is expected to have a firewall - Peer-to-Peer computing -- worldwide -- isn't going to happen.

          That's only true if you insist that the messages that pass between the computers have to be executable code. In the real world I don't think that is necessary or desirable.

          This was actually the subject of a long conversation Uri Rabinski and I had with Alan when he spoke at the Darmstadt WWW conference. Alan had been pushing the idea that PDF was a better model for information interchange than HTML, because in PDF the content was encapsulated with the code that interpreted it and gave it semantics. Tim Berners-Lee later joined in the conversation but did not get any further with Alan than Uri and I had.

          Needless to say I did not agree with this idea, and at the time it would have been impossible to move PDFs around as the core of the Web, since they are typically five to ten times the size of the equivalent HTML and a fast modem was 28.8Kb/sec. But at a more fundamental level: with HTML, Google is possible; with PDF you are reduced to screen-scraping technologies. HTML can render well to almost any output device (or rather could, before being bastardized by Netscape); PDF renders badly to anything other than paper the same size as the original rendering.

          If you exchange declarative statements rather than programs, firewalls don't represent a barrier. This is exactly what we have in the biological world (which Alan had used as an analogy): cells do not accept raw DNA from the outside and run it. Viruses have to bypass these defenses.

          I am not sure what Alan is up to here; the person who wrote the article clearly has a much weaker grasp of what Alan is up to than Alan himself does.

          Sure there are problems with most software. Word sucks, as do most HTML editors; despite all the pretty graphics sloshed into HTML, there are still no good tools for producing printed output. Open source alternatives suck even worse: we get a bad copy of Word and several bad HTML editors. Same for Excel and spreadsheets.

          If Wolfram had spent the last ten years doing something more important than writing a book that claims he is the modern Newton, Mathematica might have gone somewhere interesting. Unfortunately it has gone from being a niche-market tool for scientists to being a niche-market tool for scientists and some engineers.

      • Re:Arrgh.. (Score:3, Insightful)

        by Defiler ( 1693 ) *
        I'm not belittling Mr. Kay's work.. Obviously his contributions have been significant. However, the ideas that are actually expressed in this article (not the ones that were old news in 1985) seem entirely vague and "catty". He claims we haven't done anything interesting with PCs in the last week. Arrogance.
        He does have a product.. He has his reputation as a visionary. In his line of work, that's more important than any software application or widget.
        His example: A software package that just looks like the modern equivalent of LOGO. Interesting, sure. Probably lots of fun to play with as a child. More compelling than e-mail or Wikipedia? Please.
        • Re:Arrgh.. (Score:5, Interesting)

          by Pig Hogger ( 10379 ) <pig.hogger@gmail ... m minus caffeine> on Tuesday July 13, 2004 @08:43AM (#9685426) Journal
          His example: A software package that just looks like the modern equivalent of LOGO. Interesting, sure. Probably lots of fun to play with as a child. More compelling than e-mail or Wikipedia? Please.

          Don't belittle Smalltalk. It ain't. Case in point: some years ago, a friend of mine had the misfortune of having sold beaucoup computers and servers to an ailing airline, which was pretty much behind in its payments.

          One day, I get an enthusiastic phone call from him: can you go to the airport, to $AIRLINE's offices, to fix their Macintosh??? (I was the outside Mac expert.) When I got there, the V.P. of finance was waiting for me at the reception and handed me a five-figure cheque for the outstanding invoices...

          Turns out that this single computer had an AI application written in Smalltalk that handled all the logistics and scheduling of their aircraft fleet; their whole operations depended on this one computer.

          I was not able to fix the Mac: its motherboard was shot.

          A week later, they filed for bankruptcy, but at least the cheque cleared.
      • Re:Arrgh.. (Score:3, Informative)

        by dekeji ( 784080 )
        Unlike most of those, he's implemented his ideas, and has spent the last 30-some years implementing them- in real, live, usable code

        Kay has done more than a lot of visionaries in implementing his ideas. But have you actually tried to use Squeak or any of his other projects? They make neat demos. They demonstrate ideas very nicely. But I haven't found the "real, live, usable" part.

        Sadly, I find Squeak not even to be very useful for purposes where it should actually excel: user interface research.
    • Re:Arrgh.. (Score:5, Insightful)

      by cagle_.25 ( 715952 ) on Tuesday July 13, 2004 @08:15AM (#9685164) Journal
      From the article,
      But a man like this cannot be dismissed merely because he occasionally creeps toward arrogance. What's much more important is that he does not merely complain. He has a vision and a team working to bring his alternate vision to reality.
      Alan's point is that the truly mathematical aspects of computing have become second-place to the eye-candy aspects. I think he's right, but I also think it was inevitable. Why would hordes of people who never loved math before all of a sudden become mathematicians just because they have computers to use?

      Of course, Alan's aim is to change the tide. Hence, his work on Squeak. The goal for him is to use computers as a tool to enhance our thinking. More power to him.
      • Alan's point is that the truly mathematical aspects of computing have become second-place to the eye-candy aspects.

        No.

        "[H]e says, today's PC is too dedicated to replicating earlier tools, like ink and paper."

        [. . .]

        "Kay's ultimate dream is to completely remake the way we communicate with each other. At the least, he wants to enable people to collaborate and work together simply and elegantly. For him, 'the primary task of the Internet is to connect every person to every other person.'"
        • Re:Arrgh.. (Score:3, Informative)

          by Artifakt ( 700173 )
          Too dedicated to replicating earlier tools like ink and paper?

          Like having a "desktop" with "manila folders" where home users typically store "documents", even though these days half those 'documents' are music or video?

          Like showing a piece of paper being moved from manila folder to manila folder in file transfer dialogs?

          Like having a "recycle bin" that looks like a trash can and presumably catches those itty bitty pieces of paper that lurk inside the machine?

        • Re:Arrgh.. (Score:3, Informative)

          by cagle_.25 ( 715952 )
          I respectfully reaffirm my position. Quoting the larger context,

          If business users were less shortsighted, Kay says, they would seek to create computer models of their companies and constantly simulate potential changes. But the computers most business people use today are not suited for that. That's because, he says, today's PC is too dedicated to replicating earlier tools, like ink and paper. "[The PC] has a slightly better erase function but it isn't as nice to look at as a printed thing. The chances that in the last week or year or month you've used the computer to simulate some interesting idea is zero--but that's what it's for."

    • Re:Arrgh.. (Score:5, Interesting)

      by hcdejong ( 561314 ) <.hobbes. .at. .xmsnet.nl.> on Tuesday July 13, 2004 @08:22AM (#9685237)
      But he does have a point. Most of the effort that's gone into hardware and software development has been aimed at doing the same things faster. Real innovation is very rare. Our desktops still are essentially the same as the 1984 Macintosh. PDAs still haven't caught up with the Newton. Computers are still dumb.
      • Re:Arrgh.. (Score:4, Interesting)

        by JUSTONEMORELATTE ( 584508 ) on Tuesday July 13, 2004 @08:42AM (#9685414) Homepage
        When Apple released the Newton, they knew that the handwriting recognition wouldn't work well for all users right out of the box, so they shipped a game which let the Newton learn how to recognize your particular handwriting.
        When USRobotics released the Pilot (later to become the Palm Pilot) they knew that the handwriting recognition wouldn't work well for all users right out of the box, so they shipped a game which let the user learn how to write the Pilot's particular handwriting.

        Bummer how things progress sometimes.

        --
        • Re:Arrgh.. (Score:4, Insightful)

          by hcdejong ( 561314 ) <.hobbes. .at. .xmsnet.nl.> on Tuesday July 13, 2004 @08:49AM (#9685482)
          Nah, when USRobotics released the Pilot (later to become the Palm Pilot) they knew that the handwriting recognition wouldn't work well, so they required you to learn the device's alphabet rather than allowing you to use your own.
          And it's not just the handwriting. On the Newton, you could enter 'lunch with Mariah' and the Newton would connect the name with that person's entry in the address book. 10 years later, my Palm still can't do that. Nor can my PC.
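
          A minimal sketch of that kind of linking, in Squeak Smalltalk (the contact data and appointment text here are invented): scan the free-text entry for any word that names a known contact.

          | contacts words match |
          contacts := Dictionary new.
          contacts at: 'Mariah' put: '555-0100'.
          "Scan the appointment text for a word that matches a contact name."
          words := 'lunch with Mariah' substrings.
          match := words detect: [:w | contacts includesKey: w] ifNone: [nil].
          match ifNotNil: [:m | Transcript show: 'linked to ', (contacts at: m); cr].

          The lookup is trivial; the hard part is deciding when a word is a name, which is presumably why the Newton's designers got credit for it.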
      • Re:Arrgh.. (Score:4, Insightful)

        by mcrbids ( 148650 ) on Tuesday July 13, 2004 @01:21PM (#9688860) Journal
        Our desktops still are essentially the same as the 1984 Macintosh. PDAs still haven't caught up with the Newton. Computers are still dumb.

        Computer technology is evolving. Quickly.

        Biological evolution took billions of years to get to today. Have you ever read up on Carl Sagan's Cosmic Calendar [discovery.com]? If you were to compress the known history of the universe into a single calendar year, all of written human history would comprise the last 15 seconds on December 31!

        Whether you're talking about technology or biology, you can't evolve anything too quickly, or you throw out all the accumulated wisdom in the current design. That's why birth defects and substantial changes in genetics are rare - evolve too quickly and the mortality rate climbs to unsustainable levels.

        The QWERTY keyboard is with us, perhaps for centuries to come, even though there are "better" alternatives. But these "better" alternatives cost a lot more TODAY to develop and implement than continuing with QWERTY. So if you "know how to type", you're using a QWERTY.

        To change to another keyboard, you have to throw out all the accumulated wisdom associated with QWERTY keyboards - all the trained office workers, all the existing equipment in place right now, the typing tutor software, the toys, cell phones, PDAs, etc.

        And why? The QWERTY is "good enough", so we invest our resources elsewhere.

        Here's another example: Joel on Software - Things You Should Never Do [joelonsoftware.com]. In this piece, Joel argues that rewriting your nest-egg software is the kiss of death for a software company, for the simple reason that even cruddy, poorly cobbled-together software often has many man-years of embedded wisdom in it - bugs fixed, design issues resolved, special cases handled, etc.

        You simply can't rebuild anything significant from scratch without tremendous cost. That's why our very sophisticated human cerebral brains are built upon the much simpler mammalian brain, which is in turn built upon the very simple lizard brain inside our heads. It's very literally three concentric sections of brain, with the lizard brain in the middle, the mammalian brain wrapped around that, and the cerebral cortex packed on around the outside!

        The biological cost of rebuilding our brains to factor out the now much-antiquated lizard brain functions is simply too high to be viable, so it's never happened, and the lizard brain is simply "infrastructure" for higher development.

        Look at the history of cities. You'll see the exact same pattern there... Example? Los Angeles spent 75 years developing around the automobile; its recent construction of subways has been extremely expensive (300 MILLION DOLLARS PER MILE), and the residual effects of the subway on local business have driven many to bankruptcy.

        It's been very costly, very slow, and cost overruns are the norm.

        So, when I hear somebody talk about making major changes to existing infrastructure, it's hard for me NOT to dismiss them, no matter their credentials. You simply *don't* change critical infrastructure of any kind without serious review and contemplation, and even then, you have to assume that it'd be 10x as costly and painful as you can imagine.
  • Creativity (Score:5, Funny)

    by muttoj ( 572791 ) on Tuesday July 13, 2004 @08:03AM (#9685059)
    I do not agree with the writer. It takes me a lot of creativity to find different ways to frag my friends in Battlefield 1942. Also, playing Battlefield teaches me some nice skills for real life (press 9 for parachute whenever I fall out of an airplane, and such).
  • Not-So-Sad Truth (Score:5, Insightful)

    by CommanderData ( 782739 ) * <kevinhi@y[ ]o.com ['aho' in gap]> on Tuesday July 13, 2004 @08:04AM (#9685064)
    From the Article:
    The chances that in the last week or year or month you've used the computer to simulate some interesting idea is zero--but that's what it's for.

    I'd have to disagree with Kay here; just because his work was with education and simulation doesn't mean that is really what computers are to be used for. They're the most unique and versatile tool ever invented by man; their purpose is whatever we choose it to be at the moment.
    • by Deag ( 250823 ) on Tuesday July 13, 2004 @08:11AM (#9685127)
      I have to agree with this; he comes across as bitter about something.


      Also from the article:


      Kay's ultimate dream is to completely remake the way we communicate with each other.


      I'd say this has been fairly well achieved (it came across in the article that it hadn't been). I can't vouch personally for 30 years ago, but I'd say the way we communicate with each other has changed a lot since then - text messages, email, and mobile phones are a different way of communicating than what we had.

    • Exactly. Just because Kay can't think outside of his box doesn't mean we should be captives of it. If everyone were using computers how he wants, they wouldn't really be "personal" anymore.

      As a programmer I put interesting ideas to good use and learn new things every day. The chance may be a lot smaller with average joes who just check their email but there's still a good amount of people who go deeper than that.
    • Re:Not-So-Sad Truth (Score:5, Interesting)

      by Scarblac ( 122480 ) <slashdot@gerlich.nl> on Tuesday July 13, 2004 @08:14AM (#9685153) Homepage

      They're the most unique and versatile tool ever invented by man; their purpose is whatever we choose it to be at the moment.

      I think that's his point - they're the most unique and versatile tool ever invented, we could do anything, but what we use it for is 99% things we basically had before - business documents and simple calculations, games, video and audio replay/recording.

      They could be so much more.

      • I don't think anyone would blow off a suggestion from Mr. Kay if he had another use in mind. I agree that this is another vague and empty statement.
      • What we use it for is what we do every day, because that's what we've got to do. I do what my boss tells me to do, and I get paid, and that allows me to secure living space, food, and hopefully some entertainment and so on.

        That's not to say that we all use computers for the same old thing, but most of us do because we're not computer researchers, mkay ... we don't have time to make the computers do wonderous things ...
      • Why isn't there a program that graphically represents possibilities? Every one of us has to make complex decisions, each of which has a set of factors and pros and cons. Why can't the computer take this set of factors and "map" them, allow us to attach probabilities at each level, and then graphically highlight trouble areas and predict desirable outcomes?

        Things like deciding whether to carry X or Y product would be more tactile and visual, and probably more accurate than a flat spreadsheet. Hell, any
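
        A minimal sketch of the kind of model being asked for, in Squeak Smalltalk (the probabilities and payoffs are invented): weight each outcome by its probability and compare expected values.

        "Expected value of carrying product X vs. product Y."
        | ev outcomesX outcomesY |
        ev := [:outcomes | outcomes inject: 0 into: [:sum :each | sum + (each key * each value)]].
        outcomesX := { 0.6 -> 120000. 0.4 -> -30000 }.   "probability -> payoff"
        outcomesY := { 0.3 -> 250000. 0.7 -> -40000 }.
        Transcript show: 'EV(X) = ', (ev value: outcomesX) printString; cr.
        Transcript show: 'EV(Y) = ', (ev value: outcomesY) printString; cr.

        The graphical highlighting is the hard part, of course, but the underlying arithmetic really is this small.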
    • Sorry, but computers are the most unique and versatile tool EVER invented? Step away from the PC every now and again and check out the world.

      The Wheel? Levers? Arches? Steel? Medicine? A bajillion other things? The computer is great, but the world did plenty without them. The computer has made us all stupider for using them, I think.

      Anyone remember how to do long division?
      • I don't totally agree either, but it's not that far-fetched.

        First, he said "tool." Medicine is not a tool.

        Second, he said "versatile." In this case, versatile means "flexible" or "has many different uses."

        Computers can generally be used for:
        Games
        Internet
        etc

        But we use them for controlling sytems (nukes, trains, planes, etc), running simulations, protein and DNA analysis, keeping people alive, telling us what time it is, communication, data storage, mathematics, encryption, and many more that we haven't eve
      • Re:Not-So-Sad Truth (Score:3, Interesting)

        by Telex4 ( 265980 )
        I wouldn't say that computers have made us less intelligent, but, along with TV and other "one-way" inventions, they've certainly helped a lot of people be far less creative in their day-to-day activities.

        How many people now sit around playing computer games or watching TV rather than being creative?

        Of course there's a flipside, and a lot of people (including myself) are now far more creative with computers, but I think Alan Kay's point was that very few people fit this description, and that has a lot to do w
      • by gosand ( 234100 )
        Sorry, but computers are the most unique and versatile tool EVER invented? Step away from the PC every now and again and check out the world.

        Think of what computers have allowed us to do. Not just personal computers, all electronic computers. They are everywhere. Sure, they may be used for a lot of conveniences, but those are fantastic conveniences. Do you remember what it was like to check out at the grocery store 20 years ago? I cannot imagine doing that now. It takes minutes to run an entire cart

      • by Alioth ( 221270 )
        I would disagree with you. Whilst we would never have got where we are now without the wheel, the cantilever arch, suspension bridges, nuclear reactors and aircraft, the computer is unique.

        Our PCs are general purpose machines which can be other machines merely by describing what these machines are (the description of course being the program and data). A wheel is always a wheel. OK, sure it can be laid flat on its side and used to hold dirt and grow plants. A plane is always a plane. A computer however ca
    • It partly depends on how you define simulation.

      This week I have been reading news on the WWW, using my computer to simulate a newspaper.

      I have also used a CAD package to simulate a drawing board.

      I used a word processor to simulate a typewriter (with some improvements).

      Pretty well everything we do with computers can be considered a simulation, in that none of it actually exists and the reality is a bunch of electrons. Desktops, images, icons, fonts, etc., they are all simulations.

      Of course to keep

    • Simulate (Score:4, Informative)

      by base_chakra ( 230686 ) on Tuesday July 13, 2004 @08:36AM (#9685365)
      His specific complaints are understandable considering his long (and illustrious) career in computer science, but the underlying thesis is simply that we (including Alan Kay) haven't even begun to appreciate what computers can do. Kay yearns for a paradigm evolution, and considers our anchored situatedness to be detrimental.

      Please don't color the word "simulate" too much when reading Kay's words. To simulate is to recreate (approximate) one system in another system. Mathematics is a mode of simulation. The sole purpose of computers is simulation.
  • Without the PC, there are many geeks that wouldn't know the first thing about the female body. I feel that we are learning a lot!!
  • Personal? (Score:4, Interesting)

    by cowscows ( 103644 ) on Tuesday July 13, 2004 @08:06AM (#9685079) Journal
    I would think that since it's "Personal" computing, that the individual user can decide what it's all about to them. My mom uses her computer to keep in touch with me over email and instant messaging, and she trades stupid email jokes, programs, and malware with all of her friends. Those are pretty personal, non-business related uses if you ask me.

    Maybe Kay should've tried to call it the Educational Computer instead of Personal computer all those years ago.
  • This guy clearly does not post on or read slashdot ever. Nothing but learning and creativity here :-P

    Come to think of it, basically everything I ever do with my computer involves a certain amount of learning and creativity.

    Sounds like someone is lamenting their choice of employment -- just because HP is lacking in the forefront creativity department doesn't mean the last 20 years of computing development is in the toilet.

    Course by the time I hit submit, I'm sure there will be 50 other posts with this ex
    • Come to think of it, basically everything I ever do with my computer involves a certain amount of learning and creativity.

      What depresses me is that every creative idea I have for stuff to do with my computer seems to require vast amounts of toil, scouring through documentation, and learning how to jump through some very arbitrary-seeming hoops to get to where I want to go. That's the reason why I spend very little time being creative at my computer.

  • werd (Score:5, Insightful)

    by RevAaron ( 125240 ) <revaaron@hotmail. c o m> on Tuesday July 13, 2004 @08:07AM (#9685094) Homepage
    Anyone who has spoken with him personally- in person or via email- or read his words and seen his vision knows this. Alan is *the* man.

    There's a great Xerox video we have here at our uni library- "Doing with Images Makes Symbols: Communicating with Computers"- released in 1987 while Kay was a fellow at Apple. Anyone who wants an enthusiastic and engrossing view of what Kay thinks computers *should* be (and I'm 100% with him!) should check it out.

    Also, look into Smalltalk. Alan works on Squeak [squeak.org] Smalltalk [gatech.edu]- rather than C++ or Java- and there's a good reason for it. Smalltalk has the tendency to empower both end user and programmer. It's "open source" in a way that most slashdotters have never imagined. It's kind of like having your whole computer run Emacs, but without being stuck with some funky half-GUI, half-terminal app with nothing but key commands to drive it. Squeak gives us the power to control our computing environment in a similar way, although Squeak is a lot closer to a "conventional" GUI environment than Emacs. That said, there are a lot of things about Squeak's GUI toolkit- Morphic- that are highly unconventional, but quite great to have around.

    OK, enough early morning rambling from me...
    • Re:werd (Score:3, Funny)

      by Speare ( 84249 )

      While I understand and empathize with your argument- and I've used emacs for sixteen years- I must say that a recommendation that reads "like emacs, only more so!" will not sway your average personal computer user (or even your devotee) to try it out.

    • Re:werd (Score:4, Informative)

      by base_chakra ( 230686 ) on Tuesday July 13, 2004 @08:42AM (#9685415)
      Anyone who wants an enthusiastic and engrossing view of what Kay thinks computers *should* be (and I'm 100% with him!) should check it out.

      I agree! I notice that nearly every post I've read glibly dismisses Kay's assertions (after mere seconds of processing). It may feel empowering to so dully contradict a person whose thoughts are highly regarded, but that doesn't really do anything to elevate the pundits' opinions--just the opposite.
    • Although I have an M.Sc. in Computer Science and taught myself LISP in 1979, I could not make sense of Squeak after I downloaded it. Yes, I could play around with it, but I failed to figure out how to write a program using it. I wonder how this can be the ideal programming environment for children.

      Squeak/Smalltalk is just another programming language and can hardly be seen as something that would revolutionize PC use. I agree with the observation that the current state of computing has not improved much in

      • by RevAaron ( 125240 ) <revaaron@hotmail. c o m> on Tuesday July 13, 2004 @09:21AM (#9685801) Homepage
        Squeak isn't all that hard to figure out. But if you're used to having a nicely written book, you can buy one- a couple exist for Squeak specifically. Squeak's online documentation is lacking, there's no doubt about that- especially in the area of newbie orientation. Making apps in Squeak is different from making apps in Java, C++, or even Lisp. The environment's different, for one.

        The basic idea for creating a program in Squeak is to open up the Class Browser. Make a new class. Code away. Depending on what your program does, you may need more than one class.
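
        For instance (a sketch- the class, method, and category names here are invented, and this is the Squeak 3.x-era template), the Class Browser hands you a definition like this to fill in, and then you add methods one at a time in its code pane:

        Object subclass: #AddressBook
            instanceVariableNames: 'entries'
            classVariableNames: ''
            poolDictionaries: ''
            category: 'MyApps'

        "A method, typed into the browser's code pane:"
        add: aName phone: aNumber
            entries isNil ifTrue: [entries := Dictionary new].
            entries at: aName put: aNumber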

        Or, you can make apps without doing it the old fashioned way. In Squeak, you can draw up your GUI, composing it with widgets out of the Morphic Toolbox, and then adding scripts. When this button is clicked, do this or that. Etc. There are some good tutorials for this newer way of making programs.

        It is an (not *the*- anything can be improved) ideal environment for kids- when you've got people teaching them. People used to coding in the same form for a long time often have a hard time learning Squeak. But then again, a lot of old assembly and C hackers have a hard time doing C++ or Java without spending a lot of time thinking about how to design OO systems instead of procedural ones. But old dogs can learn new tricks.

        I learned Smalltalk and Squeak on my own, teaching it to myself. I had no problem doing it. Didn't have a textbook- or any book, for that matter. While there were even fewer online docs back in those days, that's where I started, but then I moved on to mostly exploring the system. In Smalltalk, you have the Class Browser, which allows one to browse the source code of anything in the system. I learned by example and by doing. So far, that's how I've learned every language I know; learning just by reading books doesn't work for me. When I wanted to know how to make a GUI, I looked at the source of the simplest built-in apps in Squeak, learning how a GUI was constructed, then made something simple of my own- a simple address book. After that point, it's just a matter of checking the reference- that is, looking at the class hierarchy and the methods that are provided.

        I think some personality types don't take well to this kind of exploratory programming, preferring to learn in a more passive way. That's fine- to each her own. Squeak tends to draw folks who do like that style of learning and doing. Once it's learned, it is really handy. "Learning" Java for me didn't take that long, and writing a program is mostly a matter of having the online class reference handy. In the best case, Squeak would provide more documentation for those who learned to program the old-fashioned way, but in any OSS community, no one wants to be the one to write such docs. :P
  • by spidergoat2 ( 715962 ) on Tuesday July 13, 2004 @08:08AM (#9685113) Journal
    You can't teach anyone to be creative. You either are, or are not. That said, I think there are a few useful tools to aid the creative process, writing, drawing, music, etc., but I don't believe there are many, if any, tools to enhance the creative process. Maybe computers can't do that.
    • by cowscows ( 103644 ) on Tuesday July 13, 2004 @08:27AM (#9685296) Journal
      I entirely disagree. Just about everyone is born creative. Watch some little kids sometimes. When they get bored they'll take whatever toys they can get their hands on and use them as props to get completely absorbed in a storyline or world that their brain makes as they go. It may not be very complicated, but kids don't yet have much to base it on.

      Life does a good job of teaching us to be less creative. Our culture is so full of complicated yet boring things that we have to spend most of our time doing, and so creativity can often fall by the wayside. I'm glad that I had to take all of those math classes in grade school, but every hour that I spent doing my geometry homework was one less hour I could spend playing with Photoshop. Nowadays, I've not only got work to deal with, I've also got to spend my free time paying bills, going grocery shopping, cleaning the house, trying to understand what the hell is going on with the politics in my city, state, and country... when I sit down with a pad and paper and try to design a table that I need to build, I'm too tired to think.

      Sadly enough, I think things have gotten worse for kids as well. There are so many different toys, and they have such complex features, they almost take the need for creativity away. An example talked about often on /. is Lego. When I was younger, I had a few random sets. Some spaceships, some city sets, some just plain old blocks. And I made all sorts of crap. My next-door neighbor had all of the sets from one of the spaceship series (including the badass monorail), but he was so obsessed with the series itself that he would just build each object according to the instructions and sit it on the floor with all the others. He wouldn't dare take them apart, much less let me near them. The only decision he made was which space station outpost got put next to the lunar landing pad. That jerk was pretty much the same way with all of his G.I. Joes too. Until I started throwing them down the stairs- he did enjoy that.

      Anyways, while some people are naturally better at being creative than others, that doesn't mean many people are inherently unable to be creative. Creativity is one of the defining features of our intelligence. It's what puts our minds above those of animals. Anytime you aid the creative process, you improve it. It's not a learned skill per se, it's a re-learned one.

    • by YouHaveSnail ( 202852 ) on Tuesday July 13, 2004 @09:01AM (#9685617)
      You can't teach anyone to be creative.

      I don't know if that's true or not, but you can definitely teach people not to be creative. And that's just exactly what we're doing when we don't give our kids enough art, music, math, and language education.

      You either are, or are not.

      Maybe, but I tend to think that mostly everyone is born with a creative brain. Some kids grow up learning that it's okay and fun and good to think outside the box and are encouraged to solve their own problems in their own ways. Others grow up getting smacked for coloring outside the lines and are told not to think for themselves.

      That said, I think there are a few useful tools to aid the creative process, writing, drawing, music, etc., but I don't believe there are many, if any, tools to enhance the creative process. Maybe computers can't do that.

      I'm not sure what the difference is between "aid" and "enhance" above, but one way that computers can aid/enhance the creative process is to stop impeding it. There's probably a whole book to be written on this topic (and Kay might be the guy to do it), but in short I think that software often tends to get in the way more than it helps.

      In the beginning, there were assorted ridiculous input systems such as punch cards, paper tapes, and (ha!) rows of switches. Computers weren't much fun to use, and way too expensive for most creative endeavors. (That's not to say that the pioneers of our industry weren't creative.) And then came terminals and command lines, and life was good! Much better than before, but still so expensive that you had to be a really smart and already creative college kid just to get to use one for a bit. (Read Steven Levy's "Hackers" for more on this.) Then came personal computers, which were relatively affordable and inspired all sorts of creativity.

      But still, we were stuck with the command line, and you pretty much needed to learn all about "right" and "wrong" ways to do things, and if you did something "wrong" the computer normally did something unfriendly. (Note that text adventure games were wildly popular during this time, possibly because they encouraged one to explore a new world, and aside from maybe getting temporarily killed there wasn't much that you could do that was "wrong.") When GUIs first came into public consciousness with the Apple Lisa (there were others, but a normal person might actually have a shot at touching a Lisa), there was a lot of interest because with this strange new computing paradigm, you could tell the computer to do whatever you wanted, whenever you wanted, and there was little that you could do that was "wrong." At $10,000, though, Lisas were too expensive for most folk. Then the Mac came along and people loved it. It was relatively affordable, and easy to use, and people (Microsoft included) did all sorts of interesting things with it. Even with just two apps, MacWrite and MacPaint, people were transfixed for hours just playing and creating and exploring. About the worst thing you could do resulted in having to swap the floppy disk five or ten times.

      These days, computers are a lot more difficult and scary to use. No, don't open that attachment! You never know, it might contain a virus. Don't plug your computer into the network if you don't know the "right" way to do it, because hackers might take over your computer. Why did you set up your document like that? You've got it all wrong. Which of these 300 different commands that each do a very specific thing do you want, and in what order?

      Tools which inspire creativity are simple ones which don't have a "right" and "wrong" way to use them. Tools like Logo and MacPaint and paintbrushes and drums. You get that sort of (software) tool most often in the early and middle phases of a product's life, when a product is implemented enough to be useful, but before the manufacturer needs to justify the next seven updates and throws in all manner of kitchen-sink features.

      Friends, it's time to demand simpl
  • Since I am old enough to have experienced and remember this, I refute his assertion that business was the prime user at the PC's inception. PCs were mainly tools for education (along with Apple IIs).

    In 1987 businesses were finally ramping up with $10-20K PS/2s for CAD and other standalone work. Mainframe and minis were the big boys.

    In 1988, I interviewed with a recruiter for EDS. When I asked him where he saw PCs, he said EDS would never develop on them or for them, and that they would never catch on (how wrong
  • Software was meant to be free.

    I think Eben Moglen [wikipedia.org] puts it better in this interview [cabinetmagazine.org].
    • Re:The Profit Motive (Score:3, Interesting)

      by Telex4 ( 265980 )
      Indeed, the one theme Alan Kay didn't address is motivation.

      The "heydey" he speaks fondly of was one in which a great deal of development was done in labs in Universities or other geeky hacker havens. There you had a culture of creativity, sharing in communities and inspiring each other to create great new things. Perhaps that culture manifested itself in the technology they created.

      But now of course we have a culture that is increasingly commercialised and profit-orientated. The result? Exactly the probl
    • yeh, replying to my own post.

      Moglen was a programmer back in the early 70's. He wrote free software, not because of his ethics, but because all software was free back then. Software was a tool for users, and users were allowed to fix and improve the tools.

      Anyone could contribute to the state of the art by making a small contribution to the edge.

      The current proprietary regime blocks that. If you want one more feature in a proprietary word processor, you'd have to write a whole word processor first, and
  • by Bill, Shooter of Bul ( 629286 ) on Tuesday July 13, 2004 @08:13AM (#9685145) Journal
    1,000,000,000 windows computers on the earth, 1,000,000,000 windows computers. Take one down, replace the OS. 999,999,999 windows computers on the earth. 999,999,999 windows computers on the earth, 999,999,999 windows computers. Take one down, replace the OS. 999,999,998 windows computers on the earth. 999,999,998 windows computers on the earth. Take one down, replace the OS. 999,999,997 windows computers on the earth. And so on.

    Maybe we should use something other than gentoo.
  • wait.... (Score:4, Funny)

    by eegad ( 588763 ) on Tuesday July 13, 2004 @08:14AM (#9685152)
    you mean it's not about patches and updates?
  • by defile ( 1059 ) on Tuesday July 13, 2004 @08:16AM (#9685171) Homepage Journal

    "The chances that in the last week or year or month you've used the computer to simulate some interesting idea is zero--but that's what it's for."

    Is the listener supposed to then ask a simple question like "what would you simulate?" and he would say "everything!" and the listener says "how do you do that?" and he says "by building a model of EVERYTHING!" and the listener, still not understanding what the value of "simulating everything" means, just writes him off as a kook who will research useless ideas for the rest of his life?

    Does anyone else understand his vision?

  • It's true (Score:2, Interesting)

    by 91degrees ( 207121 )
    The Windows PC is about as far from a home user's system as it's possible to get without also making it totally unsuitable for businesses.

    In reality, the correct way to go is to step back and look at how successful home computers worked. Take, for example, the Commodore 64. This had a user interface that came up in about a second and was immediately usable. Nobody ever looked at my C64 in a confused way wondering what it did. They knew. It was obvious.

    A windows PC on the other hand is a nasty complicated mess.
    • Re:It's true (Score:2, Insightful)

      by Anonymous Coward
      " This had a user interface that came up in about a second, and was immediately useable."

      So does DOS. However, I've successfully frightened people by booting into DOS before. Y'see, the little cursor and a complete lack of visual cues confuse the poor things.

      "A windows PC on the other hand is a nasty complicated mess."

      Hmm.

      "...Even the wiring needs some expertese in electronics"

      No it doesn't. You insert the plug into the socket. Also this applies to any computer made since the AT. I have a friend t
    • by mst76 ( 629405 ) on Tuesday July 13, 2004 @08:31AM (#9685326)
      Take, for example, the Commodore 64. This had a user interface that came up in about a second and was immediately usable. Nobody ever looked at my C64 in a confused way wondering what it did. They knew. It was obvious.
      READY.
      HELP

      ?SYNTAX ERROR
      READY.
      HI

      ?SYNTAX ERROR
      READY.
      HELLO?

      ?SYNTAX ERROR
      READY.
      EAT FLAMING DEATH

      ?SYNTAX ERROR
      READY.
  • Yeah, we could have a world with a few people owning computers and being creative and the rest carrying out boring, simple tasks because we're too stupid to automate them, or we could have a world where we automate all of the boring, stupid tasks and people can spend their time being creative.
    • Nope, I just read it, and I'm still right. He seems to think that the computer is not being used well in business. I'm an investment banker who works mostly with small software companies, and the process automation software industry (better known as BPM) is something I work in a lot. Companies can automate everything now more easily than ever and spend their time doing business rather than doing paperwork.

      Moreover, Microsoft Excel is one of the most proliferated tools out there, and VERY few people use it
  • Half Speed (Score:3, Insightful)

    by krygny ( 473134 ) on Tuesday July 13, 2004 @08:20AM (#9685216)
    The whole computing industry can move only as fast as one company, and it's in that company's interest to move slowly. During the .com boom, the whole on-line industry moved as fast as the fastest company and we saw how much was done in just ten years. 20 years of Microsoft dominance has set the computer/software industry back 10 years. Another 20 years of dominance will allow us to only progress as much as we would otherwise in 10 years.
  • What-ifs (Score:5, Insightful)

    by MojoRilla ( 591502 ) on Tuesday July 13, 2004 @08:22AM (#9685234)
    Alan Kay says...

    "The chances that in the last week or year or month you've used the computer to simulate some interesting idea is zero--but that's what it's for."

    I disagree. Many business users use spreadsheets to "what-if". Perhaps he has a different idea of "interesting".
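
    The same kind of what-if, sketched in Squeak Smalltalk instead of a spreadsheet (the revenue figure and growth rates are invented): how does year-5 revenue respond to different growth assumptions?

    "Print projected year-5 revenue under three growth scenarios."
    #(0.05 0.10 0.20) do: [:growth |
        | revenue |
        revenue := 1000000.
        5 timesRepeat: [revenue := revenue * (1 + growth)].
        Transcript show: growth printString, ' -> ', revenue rounded printString; cr]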
    • Seriously. I guess he's never heard of data visualization or business intelligence. That, or he just doesn't care that people actually ARE "creat[ing] computer models of their companies and constantly simulat[ing] potential changes."
  • (DARPA in the '60s, PARC in the '70s, now HP Labs)
    wow... how did he manage to get the '80s and '90s off?

    or was he working for the company with the three-letter acronym between PARC and HP? He'd better enjoy his current job while he's got it, because on this trend, he's only got one more employer left (and I have a hard time imagining Alan Kay working for X!).
  • Changing... (Score:5, Insightful)

    by digitalhermit ( 113459 ) on Tuesday July 13, 2004 @08:24AM (#9685256) Homepage
    When I first started with computers back in the early 80's there was a lot of energy in the community. People ran BBSs, built circuit boards to attach to print heads to scan images, built weather facsimile machines, tinkered and hacked and built stuff. Those days were very enjoyable. But the only downside was that all the little hacks were for the computer. I.e., the gadgets celebrated the technology and the coolness of doing new things, but they were all about the technology itself.

    Things have changed somewhat since then. There's still Linux and new experimental OSes (and BSDs too) to tinker with. Hardware is commoditized so there's not a lot of need or desire to build memory expansion boards, but people still do interesting things. However, the biggest change is that computers are now really cool tools for doing non-computer things.

    I can only speak to my interests, but without computers I could not have easily played with video or recording, ray tracing, music production, math (some problems *require* computers to understand, at least in my case), etc. The computer today is akin to what the printing press was several centuries ago. I.e., it gives some very powerful tools to individuals of modest means. So things that were only the demesne of researchers and big companies ten years ago are now available in a relatively low-powered desktop system.
  • Think about it (Score:3, Interesting)

    by TreadOnUS ( 796297 ) on Tuesday July 13, 2004 @08:24AM (#9685260) Homepage

    How far have we really come in the last 30+ years of personal computing?

    The personal computing revolution has stalled with the advent of the WWW. Excluding the MS virus, personal computing was making a lot of progress up until the mid 90's. Since then we've failed to truly exploit the power of both a computing platform and a means of communication. Somewhere along the way we've floundered. It's not necessarily a bad thing but think about where we could be.

    Listen to the guy. He's really just asking where should we be?

    • Re:Think about it (Score:5, Insightful)

      by JavaLord ( 680960 ) on Tuesday July 13, 2004 @09:11AM (#9685723) Journal
      The personal computing revolution has stalled with the advent of the WWW. Excluding the MS virus, personal computing was making a lot of progress up until the mid 90's. Since then we've failed to truly exploit the power of both a computing platform and a means of communication.

      I have to disagree. The real leap from 1995 until now has been usability and people getting connected to the internet. The number of PCs that are "out there" has increased dramatically. In 1995 I could talk to a few of my nerdier friends online. Now I can talk to just about everyone. Communication via computers has really taken off in the past 10 years. PCs over the past 15 years have come to the point where a person with minimal knowledge can use them for online communication.

      I would also say we should look at the business world, where there is a PC on every desktop. It wasn't like that in the 70's or 80's. Sure, maybe the PC isn't being used for some great learning experience for the world, but it is being used so people can do their jobs better, including doctors and scientists. How much do you think PCs helped with mapping the genome? It probably worked out a lot more nicely than trying to get some timesharing system on a mainframe.

  • by Anonymous Coward on Tuesday July 13, 2004 @08:24AM (#9685262)
    Computers have made huge contributions to the art world. How can he think that we're falling short in creativity?

    I work in the music field and almost all the innovation in the last 10 years has come from computers (embedded at first, PCs more recently). With Reason [propellerheads.se], you can turn out a decent tune in minutes. Live [ableton.com] has introduced a whole new way to write and perform music. Those are my favorite examples but there are plenty more.

    The film and art worlds have been equally influenced by computer technology.
  • I'd rather write in Netscape Composer than Word...

    You can read a document in Microsoft Word, and write a document in Microsoft Word. But the people who did web browsers I think were too lazy to do the authoring part.

    Has Alan ever written a large document in Word? The program is designed for memoranda... it has precisely one nestable object, the table, and the program tries so hard to keep you from nesting them that I ended up embedding a table in a Visio document in a Word document to keep Word from ref
  • Croquet (Score:5, Informative)

    by lukeduff ( 156720 ) on Tuesday July 13, 2004 @08:27AM (#9685292) Homepage
    In techie terms, he is working on an infinitely scalable system for "real-time immersive collaboration done entirely as peer-to-peer machines."

    He's probably talking about Croquet [opencroquet.org] which is a 3d collaborative environment developed on top of Squeak. Impressive stuff.
  • by dekeji ( 784080 ) on Tuesday July 13, 2004 @08:28AM (#9685306)
    I agree with Kay. I also think Kay has made enormous contributions in the past. And I think that Squeak, his main project, is an enormously valuable tool. But, sadly, for all the great ideas that have gone into Squeak (and Kay's other work), I have not found the implementations he or others have produced to be very useful. Having great ideas is no good if you don't manage to implement them in ways that people can actually use.

    So, we have those who do the work implementing things that real people actually use (Gnome, KDE, Sun, Microsoft, Apple, etc.), and then we have those who talk about great ideas and grand schemes but whose implementations aren't all that useful (Kay, the various "usability gurus", etc.). The first group doesn't do enough background research and/or just likes to pretend for PR reasons that they are "innovative". The second group likes to complain about how awful things are but never quite gets its act together to produce something more useful.

    How can we improve things? Things get better when people like Kay take actual implementations a little more seriously and people in "industry" stop reinventing the wheel. And software developers and end users need to become a bit more informed about the products they use and make better choices, instead of just buying what's popular or hip.
  • by CarrionBird ( 589738 ) on Tuesday July 13, 2004 @08:30AM (#9685322) Journal
    There are some great things being done. They are just being drowned out by the vast majority of PC users who don't care. To most people it's an appliance, it's an internet toaster.

    The net result of the consumerization of the PC and the internet is a landscape that only wants to hear about what can be packaged and marketed.

  • He's not wrong... (Score:3, Insightful)

    by Pig Hogger ( 10379 ) <pig.hogger@gmail ... m minus caffeine> on Tuesday July 13, 2004 @08:37AM (#9685370) Journal
    I've been touching computers for close to 30 years, and working with them for 25 years, and what we have now is not functionally different from what we had then.

    The only difference is eye candy like menus, windows and whatnot.

    Otherwise, it's pretty much the same, and, even when you put in particularly creative applications like Photoshop, Illustrator/Freehand, Autocad or any music composing system, you basically have "a better version of an older tool, pen and paper".

    There aren't really NEW applications that are really creative; perhaps the only thing that comes close would be Usenet, if it weren't swamped by the line noise...

  • Kay still at HP? (Score:3, Informative)

    by chess ( 40930 ) on Tuesday July 13, 2004 @08:40AM (#9685397)
    Reading TFA, I got the impression of a man trapped in the wrong company.

    Since active cynic Carly took over, there is no HP any more.
    It's NewAgeP: no more research needed - except into how to suppress printer-ink refilling. Product creation sold off to Intel (when she notices the chipset guys are doing well, she'll sell those poor souls to Intel too).
    Corporate culture vaporized. Business-is-adding-a-sticker attitude.

    What is this guy still sitting in his chair at HP for?

    chess
  • He's got a point.. (Score:5, Insightful)

    by Bigman ( 12384 ) on Tuesday July 13, 2004 @08:41AM (#9685406) Homepage Journal
    I'm old enough to remember the early days - my first computer was an 8k PET. While the technology was primitive, computers were sold as creative devices. My PET had a built-in interpreter, and it switched on straight to the command prompt. The machine, by its nature, encouraged you to get involved with programming, because it was so simple. Yes, there were word processing packages, games and the like, and you got used to loading and running these, but all the time you knew that the real fun was learning to program.
    Nowadays, a Windows PC doesn't even come with any kind of programming language (not counting batch files..) and the GUI metaphor discourages automation of tasks (which was the Great Hope that computing promised..)
    The internet has been converted from a fascinating library to some sort of dumb TV plastered with adverts... The increasing and unfettered commercialisation of the internet is gradually making it unusable. I can't even get my site listed on Google, never mind high up the list, because Google's more interested these days in promoting commercial sites. And don't get me started on spammers (unless I've a 2x4 in my hand!)
  • by rackrent ( 160690 ) on Tuesday July 13, 2004 @08:43AM (#9685429)
    One of the subtexts of Kay's commentary seems to be that most operating systems train you how to use them, whereas I think he would like to see the actual person make the computer perform the functions that they would like it to.

    A subtle distinction, I know, but I remember helping teach a class on LOGO a long time ago (ok, I was a geek at age 12), and that was the advantage of it for little kids... they were in charge of the computer, not the other way around. I don't see that philosophy as much today in widely distributed programs.
  • Well... (Score:3, Insightful)

    by HarveyBirdman ( 627248 ) on Tuesday July 13, 2004 @08:45AM (#9685447) Journal
    I feel Alan's pain, but what he fails to understand is this:

    Most people are not creative, and most hate to learn. This is a sad truth. The number of people who like to learn new things throughout their lives, or create things just for the sake of creating, is a thin sliver of the general population.

    I like to do 3D computer art, and have started programming for fun again after a long lapse. Most people who know me, many of them professionals with advanced degrees, can't grasp why I want to do it as they turn back to their latest Grisham lawyer epic.

    The sad truth is that the state of personal computing is exactly what the market (i.e. the consumers) wanted. They want games and pr0n and free music. No amount of hand-wringing or high-falutin' pondering is going to change that.

    The other problem:

    For him, "the primary task of the Internet is to connect every person to every other person."

      When people say stuff like this, they are only really thinking about their friends and family, or maybe some small collection of online pals.

    You really want to be connected with atrocities like stompthejews.org or purty-yung-thangs-only-mildy-related-to-yoo.xxx or microsoft.com?

    Honestly, what is all this infinite connectivity going to bring us over what we have now?

    And business, he says, "is basically not interested in creative uses for computers."

    No, it's just not interested in what Alan Kay is interested in.

    The guy is brilliant, and he's done great work, but I'm afraid he's developed the tunnel vision common to people who have had their egos stroked (no matter how well deserved) for many years. There are some small businesses out there able to automate things that would have required a lot of tedious drudgework in past decades, thanks to those "uncreative" business applications.

    Sorry, Alan, but behind all the education and fancy learning objects, there's still a world to run, resources to move about and daily chores to be done. And we're going to use boring gray box computin' machines for it.

    "pretty much everything that's believed is bullshit."

    OK, now here I agree with him. :-) But he might want to apply the bullshit test to his own beliefs. I try to do it on a regular basis. It's sometimes painful to let go of a closely held belief, but if the facts do not support it, you have to dispose of it.

  • by JavaLord ( 680960 ) on Tuesday July 13, 2004 @08:49AM (#9685488) Journal
    The chances that in the last week or year or month you've used the computer to simulate some interesting idea is zero--but that's what it's for.

    Dude, I use it every night to simulate a girlfriend, and that is pretty damn interesting.

    Kay should take a break from all of this research BS and check out some of the great porn on the internet. He wouldn't be so down on the state of the industry then.
  • by buckhead_buddy ( 186384 ) on Tuesday July 13, 2004 @09:04AM (#9685653)
    The language Squeak wasn't my introduction to object-oriented programming, but having stumbled through Java first, I found Squeak to be a much better object-oriented learning environment. No language treats "everything" as an object, despite the claims, but Squeak really comes darn close.

    The Squeak programming environment, along with the Korienek, Wrensch, and Dechow book [aw-bc.com], is what made the idea of object-oriented programming really click in my brain. Even if you never write a "real" program with Squeak, its value is that you can learn OO principles without the baggage of a C heritage or of designers who've shortcut language consistency in the name of efficiency. Those are trade-offs you may well want to make when writing a "real" program, but not ones you want to short yourself on during your education.
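    Squeak itself is Smalltalk, where even integers and classes respond to messages. As a rough flavour of the same idea in a language more readers will have installed (Python comes close, though not as close as Squeak), consider this illustrative snippet:

```python
# "Everything is an object," illustrated in Python: literals,
# functions, and even classes all carry methods and attributes.
print((3).bit_length())     # a bare integer responds to methods
print("abc".upper())        # so does a string literal

def greet():
    return "hi"

print(greet.__name__)       # functions are objects with attributes
print(type(int))            # even classes are objects (instances of type)
```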
  • by ZakMcCracken ( 753422 ) on Tuesday July 13, 2004 @09:06AM (#9685679)
    I always remind myself that before the Internet stepped in, I did use my computer for creativity, especially music composition (on various "trackers" for Amiga, if you must know).

    Come to think of it, it was pretty amazing given the poor technology of the time (a mere 2 MB of RAM, endless floppy-swapping -- later, a "huge" 20 MB HD). The creativity of the programmers was itself amazing. They got their sound-mixing routines right, MIDI and sample synchronisation too, and the user interface -- the user interface!! -- was the best thing ever.

    And yet today there are maybe 100 times as many Windows PCs out there, each with 100 times the CPU power, but I still can't find an honest tracker for my Windows machine -- when I say honest, I mean one that won't crash my PC or demand that I buy some particular compatible soundcard. I also mean "free" -- come on, who's going to spend C-notes on professional sequencer software just for dabbling around!!

    Dudes in the '90s, up there in Finland and other places, were swamping Europe with their trackers at a time when "electronic distribution" was a euphemism for a network of enthusiasts swapping floppies through the post and holding "copy parties" at each other's places.

    Now we've got the Internet for distribution, we've got far less fragmentation in the OS space, and you'd have thought it'd all have become much easier?? Think again!

    Sure, back then we weren't able to download Britney Spears MP3s for free... Hell, if we had, we wouldn't even have had the CPU power to decode them!! But what's the new thing there? I mean, you just listen to the same music as in the store, except cheaper...

    To conclude: quit consuming pr0n and MP3s, and start coding mind-opening stuff for the masses to discover their own talents!

    (and stop reading / posting on Slashdot too, I might add)
  • by skids ( 119237 ) on Tuesday July 13, 2004 @09:17AM (#9685771) Homepage
    He was probably just talking above the reporter's head (or above what the reporter considered to be his audience's head). Or he himself hasn't found a way to express what he'd like to see in terms most people would understand.

    Most people do use computers primarily to simulate objects that they understand because they have physical samples of those objects (appliances, documents, etc.) in front of them in their daily life. What I took as his meaning was that the computer's ability to make manifest ideas and concepts that do not have common tangible real-world instances is commonly neglected, and should not be. In this respect he is entirely correct.

    But the problem, in my view, is not that no one has tried to foster such uses by making computers easier to use and understand in this capacity. There have been plenty of attempts to do so, many of them in games, some in teaching languages like LOGO. It is rather that there are few real-life examples of using manifestations of abstract objects to do something useful, or at least entertaining. Face it, most people don't subject themselves to a sit-down session with a computer unless they think they are going to get something out of it, and "modeling" intangible systems is a hard sell in this respect, especially for those who have not been taught the intellectual building blocks needed to approach such a task with any degree of confidence.

    Maybe if there were a collection someplace of testimonials and explanations by the few who have managed to get a significant real-world benefit from doing something truly abstract, it would inspire users. Some would argue that applications are that very thing, but what I'm suggesting would be more of an explanation of the human process involved -- how a person thought his way through a new or unusual application of a core technology to improve their life, rather than a spoon-fed procedural guide to doing the same thing without comprehending the thoughts behind it, which is what most applications are in the end.

    A popular game that had a programming component could also break the ice by making it into entertainment, but making it popular versus all the competition would be the obstacle to that...
  • by peter303 ( 12292 ) on Tuesday July 13, 2004 @09:23AM (#9685818)
    He is running on fumes. He did great stuff in the 1970s: inventing Smalltalk, developing graphical GUIs, and formulating the "Dynabook," an early PDA concept. This stimulated Jobs and Gates to commercialize graphical computing and OOP-based OSes. But since then Kay hasn't really invented that much; he missed "industrial-strength" OOP, and missed the significance of the Web, PDAs, cellphones and other innovations. The Gore-Gates initiative to make the Web available in every school and library by the year 2000 did far more for children's computing access than Smalltalk and eQuariums.

    (Let's see if the moderators can distinguish a contrarian opinion from troll-bait.)
  • by buckhead_buddy ( 186384 ) on Tuesday July 13, 2004 @09:36AM (#9685981)
    I think there are many places where computer science and computers could help the average joe understand something, the same way the pocket calculator gave the innumerate a tool to keep from getting lost in day-to-day life.

    In the (highly recommended) movie Fahrenheit 9/11 [apple.com], when the Michigan Senator (D) was asked "Why didn't you read the Patriot Act before passing it?" he responded bluntly, "Sit down, my son, we don't read most of the bills we pass." It was quite laughable, but very chilling.

    Legal ignorance is at an appalling level, even among people paid and elected to represent us. Computers are good at pattern recognition, and most people despise reading the mumbo-jumbo within which lawyers hide their meaning.

    Perhaps a "pocket lawyer" to help parse legal mumbo jumbo is a worthwhile thing. For most people law is a one-way street, you have to read what the IRS, city, and state send to you but you rarely have to write anything yourself. (Though Nolo and some other "mad lib" style books do a wonderful job of this).

    While there are lawyers who are trying to be devious and hide their real purpose in contorted language, government agencies should have no need to do so. Require that court rulings, city council records, and any record of law be stated both in English and in Backus-Naur form [wikipedia.org]. Rely less on the vagaries of English to preserve or hide your meaning while the OED changes the language (bling-bling? vavavoom?) and hence changes the law through its evolution.
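    As a purely hypothetical sketch of what "law in Backus-Naur form" could mean in practice (the rule names and clause text below are invented, not drawn from any statute), a toy clause written as a grammar can be expanded mechanically into every reading it permits:

```python
# A toy statute clause as a BNF-style grammar, held as plain Python
# data. expand() enumerates every concrete reading the grammar allows,
# exactly the kind of mechanical check that English prose resists.
import itertools

grammar = {
    "<clause>":   [["<actor>", "shall file", "<document>", "by", "<deadline>"]],
    "<actor>":    [["the taxpayer"], ["the employer"]],
    "<document>": [["Form A"], ["Form B"]],
    "<deadline>": [["April 15"], ["the end of the quarter"]],
}

def expand(symbol):
    """Yield every terminal sentence derivable from a grammar symbol."""
    if symbol not in grammar:           # literal legal text
        yield symbol
        return
    for production in grammar[symbol]:
        alternatives = [list(expand(part)) for part in production]
        for combo in itertools.product(*alternatives):
            yield " ".join(combo)

for reading in expand("<clause>"):      # prints all 8 permitted readings
    print(reading)
```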

  • Yes, and... (Score:3, Insightful)

    by jridley ( 9305 ) on Tuesday July 13, 2004 @09:43AM (#9686065)
    "television will be a wonderful medium for the masses to enjoy the benefits of culture and education."

    The truth is that people make any general-purpose medium or device do what they want it to do, or relegate it to irrelevancy. What most people want is to be passively entertained (couch potatoes). Build a device that can only be used for lofty goals, and nobody will buy it.
  • by scamizdat ( 795700 ) on Tuesday July 13, 2004 @10:13AM (#9686394)
    When a severely brain-damaged friend of my son gets to free himself for a few hours with Counterstrike, where he can jump and twirl and join the general melee as any other kid, I know Alan Kay is decidedly wrong.
  • Behind the Times (Score:3, Insightful)

    by BlackHawk-666 ( 560896 ) on Tuesday July 13, 2004 @10:51AM (#9686841)
    He may have been (and was) a great pioneer, but these days I think he's too busy playing with *his* old toys to notice the world has changed around him. He says we mainly read the web, and yet every person posting here is *writing* the web. He has overlooked the impact of CMSes and, more importantly, wikis. Why no mention of Skype, BitTorrent, 3 degrees of separation or any form of IM? Step aside, old man; let the young lions continue your work, or let the scales fall from your eyes.
  • by 12357bd ( 686909 ) on Tuesday July 13, 2004 @11:19AM (#9687301)

    After more than 20 years of programming, my opinion is that Alan Kay is right. Those who are old enough know that there used to be expectations (e.g. computers will understand human language); now there are refinements (oh, look at that, spell-check on any text entry, wow!).

    Even the most successful idea of those years, the web, was already (and probably better) designed in the Xanadu project.

    Hardware is even worse: one single schema (a single processing unit, lots of memory, and a hard disk), and that's all. Where are those Prolog machines? I remember a small English company that built a nice little blue box able to outperform some Crays on graphics processing. That was creativity.

    Computing has fallen victim to its own success: there was business and money to be had, and now big corporations are unable to do anything but continue with the same old crap. Of course innovation is lost; the only thing that gives software an edge is that it's a personal activity, which is why open source still remains. But the big picture is depressing: software is under MS control, and hardware is under Intel's direction. That's falling short, friends, very, very short.

  • by jasenj1 ( 575309 ) on Tuesday July 13, 2004 @12:19PM (#9688120)
    I believe people are reading WAY too much into a little one-page article, in a magazine directed at finance types, with sprinkles of quotes from Alan Kay in it.

    Some simple rules for reading anything written by a "journalist":
    1. The more you know about a subject the more the journalist will get wrong.

    2. The shorter the article the more will be left out and gotten wrong.

    3. The more complex the subject the more will be gotten wrong regardless of article length.

    So in this case we have a short article, by a journalist of unknown technical credentials, written for a target audience with no technical credentials, and people are complaining that the small quotes from someone with DEEP technical credentials on a VAST subject area are bozo-y? Please. Show me an article _BY_ Alan Kay written for the ACM, and then I'll pay attention. This article is just fodder for CEOs to annoy their IT shop with.

    PHB: Alan Kay says we should be modeling our business so we can make more money. Get on it.

    IT: I'll get right to it after I install the latest critical Windows/IE update and wipe the latest virus from all the machines on our network. (i.e. Never.)

    - Jasen.
  • by MagikSlinger ( 259969 ) on Tuesday July 13, 2004 @01:33PM (#9689031) Homepage Journal

    I found that this passage from the middle captures his argument succinctly:

    [Kay] says, "[Business] is basically not interested in creative uses for computers."

    Depends on the business. Most businesses want predictable, repeatable, accurate, auditable activity done with their PCs. Accounting is an example of a business function that does not WANT creativity. :-) I'm assuming he's not talking about these bread-and-butter computing problems but about what's done on the desktop; still, he has to remember that the desktop user also has to work in that "boring" business environment, and most jobs discourage creativity in order to "maximise efficiency".

    Some jobs will benefit from creativity, and in those cases most people feel their PCs (especially the Mac crowd) do encourage their creativity. But I can't help wondering if he's so obsessed with being creative that he's ignoring the fact that some people don't need creativity in their jobs; also, when they are being creative, they don't want to be creative in the way he wants to be creative.

    If business users were less shortsighted, Kay says, they would seek to create computer models of their companies and constantly simulate potential changes.

    Here's an example of his disconnect. Maybe they're not doing it the way Kay wants to see it done, but it's done all the time with various tools, mostly spreadsheet-based ones using plug-ins for Excel. People find the spreadsheet the most comfortable tool for modeling things and simulating their company on paper. Hell, there are some really nifty third-party plug-ins for Excel that can do Monte Carlo simulation on your spreadsheet data: you provide some extra information about your values, like variance, and the plug-in calculates the outcome curve of your model. And there are some really cool tools for MS Project to model how your project works!
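    For readers who haven't seen one of those plug-ins, here's a rough sketch of the underlying Monte Carlo idea in Python; the revenue and cost figures are invented for illustration, and no particular Excel product is being reproduced:

```python
# Monte Carlo over a toy business model: each uncertain input gets a
# mean and a standard deviation, and repeated sampling turns a single
# point estimate into an outcome curve.
import random
import statistics

N = 100_000
profits = []
for _ in range(N):
    revenue = random.gauss(1_000_000, 150_000)  # mean, std dev
    costs = random.gauss(800_000, 100_000)
    profits.append(revenue - costs)

print(f"expected profit: {statistics.mean(profits):,.0f}")
print(f"std deviation:   {statistics.stdev(profits):,.0f}")
print(f"P(loss):         {sum(p < 0 for p in profits) / N:.1%}")
```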

    From my perspective, modeling happens all the time, and people are using their imaginations to model and work with some really nifty things: from small businesses, to home users figuring out their portfolio balance, to engineering companies using their PCs to model new ways of designing structures! It just might not be the way Kay wants it done.

    I think Kay is confusing the way he wants to be creative, and how he thinks, with how everyone else should think. Berating people for not thinking like you do is, to me, the antithesis of creativity.

    But the computers most business people use today are not suited for that. That's because, he says, today's PC is too dedicated to replicating earlier tools, like ink and paper. "[The PC] has a slightly better erase function but it isn't as nice to look at as a printed thing."

    I think he's trying to say that PCs should transcend being a poor simulacrum of pen and paper. On the surface, that sounds seductive: your PC should take all that drudgery away from you, leaving you free to think. Let the PC do all the work and you do all the creativity. As someone who likes to think of himself as creative, that sounds... stupid. Painters like the feel of paint on canvas. Harlan Ellison loves the effort it takes to push the keys on his mechanical typewriter. Most artists consider the "drudgery" part of the creative process. It's a challenge to your imagination that spurs you forward. The effort of collecting and working the clay is considered a key part of pottery making; just going to a shop to buy the clay is considered death to the process. Being truly creative is about taking all there is inside you and expressing it. Making it "easier" is missing the point.

    Kay also believes that the drudgery inhibits creativity, which it doesn't. You will be creative even if all you have is a stone and a cliff face. Making it easier will not increase your creativity, nor will it improve its quality. If you want to make PCs more use


"Imitation is the sincerest form of television." -- The New Mighty Mouse

Working...