Alan Kay Decries the State of Computing 479
gnaremooz writes "Computer pioneer Alan Kay (DARPA in the '60s, PARC in the '70s, now HP Labs) declares 'The sad truth is that 20 years or so of commercialization have almost completely missed the point of what personal computing is about.' He believes that PCs should be tools for creativity and learning, and they are falling short."
Arrgh.. (Score:4, Interesting)
Instead of doing [insert clearly-defined practical thing here], you should be doing [insert vague semi-buzzword here, like "education", or "object"] and you should be using [insert visionary's product here] to do it.
Re:Arrgh.. (Score:5, Insightful)
Instead of doing [insert clearly-defined practical thing here], you should be doing [insert vague semi-buzzword here, like "education", or "object"] and you should be using [insert visionary's product here] to do it.
Not quite...
While people are certainly welcome to disagree with Kay's vision, he's not in the same barrel of monkeys that most so-called visionaries and pundits live in. Unlike most of those, he's implemented his ideas, and has spent the last 30-some years implementing them - in real, live, usable code. Kay doesn't have a product; he's got nothing in a box to sell. He does have an idea to sell, though you don't pay for it with your money. He's been doing it in a very practical way for 30 years, not just making vague promises.
Re:Arrgh.. (Score:2, Interesting)
Re:Arrgh.. (Score:5, Insightful)
That's only true if you insist that the messages that pass between the computers have to be executable code. In the real world I don't think that is necessary or desirable.
This was actually the subject of a long conversation Uri Rabinski and I had with Alan when he spoke at the Darmstadt WWW conference. Alan had been pushing the idea that PDF was a better model for information interchange than HTML because in PDF the content was encapsulated with the code that interpreted it and gave it semantics. Tim Berners-Lee later joined in the conversation but did not get any further with Alan than Uri and I did.
Needless to say I did not agree with this idea, and at the time it would have been impossible to move PDFs around as the core of the Web, since they are typically five to ten times the size of the equivalent HTML and a fast modem was 28.8Kb/sec. But at a more fundamental level, with HTML Google is possible; with PDF you are reduced to screen-scraping technologies. HTML can render well to almost any output device (or rather could, before being bastardized by Netscape); PDF renders badly to anything other than paper the same size as the original rendering.
If you exchange declarative statements rather than programs, firewalls don't represent a barrier. This is exactly what we have in the biological world (which Alan had used as an analogy): cells do not accept raw DNA from the outside and run it. Viruses have to bypass those defenses.
I am not sure what Alan is up to here; the person who wrote the article clearly has a much weaker grasp of what Alan is up to than Alan does.
Sure, there are problems with most software. Word sucks, as do most HTML editors; despite all the pretty graphics sloshed into HTML, there are still no good tools for producing printed output. Open source alternatives suck even worse: we get a bad copy of Word and several bad HTML editors. Same for Excel and spreadsheets.
If Wolfram had spent the last ten years doing something more important than writing a book that claims he is the modern Newton, Mathematica might have gone somewhere interesting. Unfortunately it has gone from being a niche-market tool for scientists to being a niche-market tool for scientists and some engineers.
Re:Arrgh.. (Score:3, Informative)
This is a long-running joke/troll here on
Re:Arrgh.. (Score:3, Insightful)
He does have a product.. He has his reputation as a visionary. In his line of work, that's more important than any software application or widget.
His example: A software package that just looks like
Re:Arrgh.. (Score:5, Interesting)
Don't belittle Smalltalk. It ain't. Case in point: some years ago, a friend of mine had the misfortune of having sold beaucoup computers and servers to an ailing airline, which was pretty much behind in its payments.
One day, I get an enthusiastic phone call from him: can I go to the airport, to the $AIRLINE offices, and fix their Macintosh??? (I was the outside Mac expert.) When I got there, the V.P. of finance was waiting for me at the reception and handed me a five-figure cheque for the outstanding invoices...
Turns out that this single computer had an AI application written in Smalltalk that handled all the logistics and scheduling of their aircraft fleet; their whole operations depended on this one computer.
I was not able to fix the Mac: its motherboard was shot.
A week later, they filed for bankruptcy, but at least the cheque cleared.
Re:Arrgh.. (Score:3, Funny)
Re:Arrgh.. (Score:5, Informative)
I've no acronym for this, but Know What You're Talking About (KWYTA?). Squeak *is* Smalltalk. It's not the only Smalltalk dialect there is, but it is the fastest-growing Smalltalk, the Smalltalk with the biggest online community around it.
If you run a LOGO implementation, written in C, on top of your Linux/X11 box, you don't say that "C is nothing but LOGO," or "Man, leenux suxors, all you can do is play with LOGO" do you? You can use Squeak in a number of ways. You can use the eToys scripting system, which is what I assume you are thinking of as modernized LOGO. Or, if for some reason you feel more "adult" doing so, you can write the GUI in all of your apps in a purely programmatic way. Or, you could do what most Squeakers do- just get the job done in the way that makes sense.
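To give a flavour of the "purely programmatic" route, here's a minimal sketch, assuming a stock Squeak image with Morphic loaded (nothing here comes from any particular tutorial):

"Paste into a Workspace and do-it: put a scriptable ball on the screen."
| ball |
ball := EllipseMorph new.
ball color: Color blue.
ball extent: 50@50.
ball openInWorld.
ball position: 100@100.

The same morph you just created can then be picked up with the mouse, inspected, or given eToys scripts via its halo - which is the point: the kid's version and the programmer's version are the same live objects.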
Re:Arrgh.. (Score:4, Insightful)
The first contact I had with programming was LOGO at a very young age. My answer to that question is that LOGO doesn't take it far enough, doesn't provide room to grow in. Squeak does. A person- a kid or adult- can learn the basics of programming a la LOGO using Squeak. But when she does, it's not just making a turtle move around the screen with simple procedural commands, but rather getting down the idea of creating objects, and then attaching actions to them. Perhaps not a huge difference on the surface, but when it comes down to learning OOP [1] it is an important distinction. Unlike with LOGO, that basic, core intuitive knowledge of OOP can be expanded upon within the same environment, and this learner can make the step up, going from just making balls bounce around the screen to writing a simple rolodex application with the same principles and no code; then make the step to writing database-driven webapps with the Seaside webapp framework and the MySQL driver, or even better, the Magma object database.
[1] Not to say that I think OOP is any sort of end-all-be-all, especially as it's imagined to be in the industry. But for someone learning to program at 15 years old right now, real knowledge of OOP would come in handy when they get their first job programming when they turn 20- OOP won't be some ancient COBOLian relic, something you've heard of but no one ever uses.
Re:Arrgh.. (Score:3, Interesting)
Personally, I love the Squeak UI. The everything-is-an-object metaphor is literally built into the UI. You can inspect any part of the system and play with it as you see fit. Moreover, it has one of the most powerful development environments I've ever encountered. It's really *quite* cool, and if there were enou
Re:Arrgh.. (Score:3, Informative)
Re:Arrgh.. (Score:3, Informative)
Kay has done more than a lot of visionaries in implementing his ideas. But have you actually tried to use Squeak or any of his other projects? They make neat demos. They demonstrate ideas very nicely. But I haven't found the "real, live, usable" part.
Sadly, I find Squeak not even to be very useful for purposes where it should actually excel: user interface research
Re:Arrgh.. (Score:5, Insightful)
Of course, Alan's aim is to change the tide. Hence, his work on Squeak. The goal for him is to use computers as a tool to enhance our thinking. More power to him.
Re:Arrgh.. (Score:2)
No.
"[H]e says, today's PC is too dedicated to replicating earlier tools, like ink and paper."
[...]
"Kay's ultimate dream is to completely remake the way we communicate with each other. At the least, he wants to enable people to collaborate and work together simply and elegantly. For him, 'the primary task of the Internet is to connect every person to every other person.'"
Re:Arrgh.. (Score:3, Informative)
Like having a "desktop" with "manila folders" where home users typically store "documents", even though these days half those 'documents' are music or video?
Like showing a piece of paper being moved from manila folder to manila folder for file transfer dialogs?
Like having a "recycle bin" that looks like a trash can and presumably catches those itty bitty pieces of paper that lurk inside the machine?
Re:Arrgh.. (Score:3, Informative)
Re:Arrgh.. (Score:5, Interesting)
Re:Arrgh.. (Score:4, Interesting)
When USRobotics released the Pilot (later to become the Palm Pilot) they knew that the handwriting recognition wouldn't work well for all users right out of the box, so they shipped a game which let the user learn how to write the Pilot's particular handwriting.
Bummer how things progress sometimes.
--
Re:Arrgh.. (Score:4, Insightful)
And it's not just the handwriting. On the Newton, you could enter 'lunch with Mariah' and the Newton would connect the name with that person's entry in the address book. 10 years later, my Palm still can't do that. Nor can my PC.
Re:Arrgh.. (Score:5, Funny)
0 fred@discworld ~ > lunch with mariah
bash: lunch: command not found
0 fred@discworld ~ > su -
Password:
0 root@discworld ~ > urpmi lunch
no package named lunch
0 root@discworld ~ >
damn
Re:Arrgh.. (Score:4, Informative)
Re:Arrgh.. (Score:4, Insightful)
Computer technology is evolving. Quickly.
Biological evolution took billions of years to get to today. Have you ever read up on Carl Sagan's Cosmic Calendar [discovery.com]? If you were to compress the known history of the universe into a single calendar year, all of written human history would comprise the last 15 seconds on December 31!
Whether you're talking about technology or biology, you can't evolve anything too quickly, or you throw out all the accumulated wisdom in the current design. That's why birth defects and substantial changes in genetics are rare - evolve too quickly and the mortality rate climbs to unsustainable levels.
The QWERTY keyboard is with us, perhaps for centuries to come, even though there are "better" alternatives. But these "better" alternatives cost a lot more TODAY to develop and implement than continuing with QWERTY. So if you "know how to type", you're using a QWERTY.
To change to another keyboard, you have to throw out all the accumulated wisdom associated with QWERTY keyboards - all the trained office workers, all the existing equipment in place right now, the typing tutor software, the toys, cell phones, PDAs, etc.
And why? The QWERTY is "good enough", so we invest our resources elsewhere.
Here's another example: Joel on Software - Things You Should Never Do [joelonsoftware.com]. In this piece, Joel argues that rewriting your nest-egg software is the kiss of death for a software company, for the simple reason that even in cruddy, poorly cobbled-together software there are often many man-years of embedded wisdom - bugs fixed, design issues resolved, special cases handled, etc.
You simply can't rebuild anything significant from scratch without tremendous cost. That's why our very sophisticated human cerebral brains are built upon the much simpler mammalian brain, which is in turn built upon the very simple lizard brain inside our heads. It's very literally three concentric sections of brain, with the lizard brain in the middle, the mammalian brain wrapped around that, and the cerebral cortex packed on around the outside!
The biological cost of rebuilding our brains to factor out the now much-antiquated lizard brain functions is simply too high to be viable, so it's never happened, and the lizard brain is simply "infrastructure" for higher development.
Look at the history of cities. You'll see the exact same pattern there... Example? Los Angeles has spent 75 years developing around the automobile, and its recent construction of subways has been extremely expensive (300 MILLION DOLLARS PER MILE), and the residual effects of the subway on local business have driven many to bankruptcy.
It's been very costly, very slow, and cost overruns are the norm.
So, when I hear somebody talk about making major changes to existing infrastructure, it's hard for me NOT to dismiss them, no matter their credentials. You simply *don't* change critical infrastructure of any kind without serious review and contemplation, and even then, you have to assume that it'd be 10x as costly and painful as you can imagine.
Creativity (Score:5, Funny)
Re:Creativity (Score:3, Funny)
This has saved me countless times since I started taking my keyboard on flights.
Now that the TSA have started confiscating them, what am I supposed to do?
Re:Creativity (Score:2)
Not-So-Sad Truth (Score:5, Insightful)
The chances that in the last week or year or month you've used the computer to simulate some interesting idea is zero--but that's what it's for.
I'd have to disagree with Kay here, just because his work was with education and simulation doesn't mean that is really what computers are to be used for. They're the most unique and versatile tool ever invented by man, their purpose is whatever we choose it to be at the moment.
Re:Not-So-Sad Truth (Score:5, Insightful)
Also from the article:
Kay's ultimate dream is to completely remake the way we communicate with each other.
I'd say this has been fairly well achieved (it came across in the article that it hadn't been). I can't vouch personally for 30 years ago, but I'd say the way we communicate with each other has changed a lot since then - text messages, email and mobile phones are a different way of communicating than what we had.
Re:Not-So-Sad Truth (Score:3, Insightful)
As a programmer I put interesting ideas to good use and learn new things every day. The chance may be a lot smaller with average Joes who just check their email, but there's still a good number of people who go deeper than that.
Re:Not-So-Sad Truth (Score:5, Interesting)
They're the most unique and versatile tool ever invented by man, their purpose is whatever we choose it to be at the moment.
I think that's his point - they're the most unique and versatile tool ever invented; we could do anything, but what we use them for is 99% things we basically had before: business documents and simple calculations, games, video and audio replay/recording.
They could be so much more.
Re:Not-So-Sad Truth (Score:3, Interesting)
Re:Not-So-Sad Truth (Score:2)
That's not to say that we all use computers for the same old thing, but most of us do because we're not computer researchers, mkay
He's right, we need a "situations planner" (Score:3, Interesting)
Things like deciding whether to carry X or Y product would be more tactile and visual, and probably more accurate than a flat spreadsheet. Hell, any
Re:Not-So-Sad Truth (Score:3, Insightful)
The Wheel? Levers? Arches? Steel? Medicine? A bajillion other things? The computer is great, but the world did plenty without it. Computers have made us all stupider for using them, I think.
Anyone remember how to do long division?
Re:Not-So-Sad Truth (Score:3, Insightful)
First, he said "tool." Medicine is not a tool.
Second, he said "versatile." In this case, versatile means "flexible" or "has many different uses."
Computers are generally used for:
Games
Internet
etc
But we use them for controlling sytems (nukes, trains, planes, etc), running simulations, protein and DNA analysis, keeping people alive, telling us what time it is, communication, data storage, mathematics, encryption, and many more that we haven't eve
Re:Not-So-Sad Truth (Score:3, Interesting)
How many people now sit around playing computer games or watching TV rather than being creative?
Of course there's a flipside, and a lot of people (including myself) are now far more creative with computers, but I think Alan Kay's point was that very few people fit this description, and that has a lot to do w
Re:Not-So-Sad Truth (Score:3, Interesting)
Take any cultural field, and you're almost guaranteed that the commercial sector is less innovative and more populist than other sectors. That's fine, but you need the non-commercial and less commercial sectors, otherwise you're left with stagnant pap. Commercial sectors don't like risk, and their whole outlook is based on financial considerations.
It is mind-boggling (Score:3, Insightful)
Think of what computers have allowed us to do. Not just personal computers, all electronic computers. They are everywhere. Sure, they may be used for a lot of conveniences, but those are fantastic conveniences. Do you remember what it was like to check out at the grocery store 20 years ago? I cannot imagine doing that now. It takes minutes to run an entire cart
Re:Not-So-Sad Truth (Score:3, Insightful)
Our PCs are general purpose machines which can be other machines merely by describing what these machines are (the description of course being the program and data). A wheel is always a wheel. OK, sure it can be laid flat on its side and used to hold dirt and grow plants. A plane is always a plane. A computer however ca
Re:Not-So-Sad Truth (Score:2)
This week I have been reading news on the WWW, using my computer to simulate a newspaper.
I have also used a CAD package to simulate a drawing board.
I used a word processor to simulate a typewriter (with some improvements).
Pretty well everything we do with computers can be considered a simulation, in that none of it actually exists and the reality is a bunch of electrons. Desktops, images, icons, fonts, etc. - they are all simulations.
Of course to keep
Simulate (Score:4, Informative)
Please don't color the word "simulate" too much when reading Kay's words. To simulate is to recreate (approximate) one system in another system. Mathematics is a mode of simulation. The sole purpose of computers is simulation.
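A trivial illustration of what that means in practice - a sketch in Squeak Smalltalk, since that's Kay's own environment, and the numbers are made up: instead of solving the falling-ball equation on paper, step it through time.

"Euler-step a dropped ball until it reaches the ground."
| h v dt t |
h := 100.0.  v := 0.0.  dt := 0.01.  t := 0.0.
[h > 0] whileTrue: [
	v := v - (9.81 * dt).
	h := h + (v * dt).
	t := t + dt].
Transcript show: 'hits the ground after roughly ', t printString, ' seconds'; cr.

Mathematics would give you the same answer in closed form; the computer lets you get it by imitating the system, which is all "simulation" means here.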
Learning, they are used for learning (Score:2, Funny)
Personal? (Score:4, Interesting)
Maybe Kay should've tried to call it the Educational Computer instead of the Personal Computer all those years ago.
Don't Quit your day (desk) job buddy... (Score:2, Insightful)
Come to think of it, basically everything I ever do with my computer involves a certain amount of learning and creativity.
Sounds like someone is lamenting their choice of employment -- just because HP is lacking in the forefront creativity department doesn't mean the last 20 years of computing development is in the toilet.
Course by the time I hit submit, I'm sure there will be 50 other posts with this ex
Re:Don't Quit your day (desk) job buddy... (Score:3, Insightful)
What depresses me is that every creative idea I have for stuff to do with my computer seems to require vast amounts of toil, scouring through documentation, and learning how to jump through some very arbitrary-seeming hoops, to get to where I want to go. That's the reason why I spend very little time being creative at my computer.
werd (Score:5, Insightful)
There's a great XEROX video we have here at our uni library- "Doing with images makes symbols [videorecording] : communicating with computers," released in 1987 while Kay was a fellow with Apple. Anyone who wants an enthusiastic and engrossing view of what Kay thinks computers *should* be (and I'm 100% with him!) should check it out.
Also, look into Smalltalk. Alan works on Squeak [squeak.org] Smalltalk [gatech.edu]- rather than C++ or Java- and there's a good reason for it. Smalltalk has the tendency to empower both end user and programmer. It's "open source" in a way that most slashdotters have never imagined. It's kind of like having your whole computer run Emacs, but without being stuck with some funky half-GUI half-terminal app with nothing but key commands to drive it. Squeak gives us the power to control our computing environment in a way similar to emacs, although Squeak is a lot closer to a "conventional" GUI environment than Emacs. That said, there are a lot of things about Squeak's GUI toolkit - Morphic- that are highly unconventional, but quite great to have around.
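To make that "open in a way most slashdotters have never imagined" bit concrete, here's a small sketch, assuming a reasonably recent Squeak image - these are stock system messages, nothing I've added:

"The whole running image is live and browsable from a Workspace."
3 inspect.                  "open an inspector on the SmallInteger 3"
EllipseMorph new explore.   "poke around inside a freshly made morph, field by field"
SystemNavigation default browseAllImplementorsOf: #printOn:.
	"browse (and edit, on the spot) every class that knows how to print itself"

That last one drops you into the same browser the system's own authors use, with every method editable in the running image - that's the emacs-like part.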
OK, enough early morning rambling from me...
Re:werd (Score:3, Funny)
While I understand and empathize with your argument, having used emacs for sixteen years I must say, a recommendation that reads "like emacs, only more so!" will not sway your average personal computer user (or even your devotee) to try it out.
Re:werd (Score:4, Informative)
I agree! I notice that nearly every post I've read glibly dismisses Kay's assertions (after mere seconds of processing). It may feel empowering to so casually contradict a person whose thoughts are highly regarded, but that doesn't really do anything to elevate the pundits' opinions--just the opposite.
Squeak - Not intuitive (Score:3, Interesting)
Squeak/Smalltalk is just another programming language and can hardly be seen as something that would revolutionize PC use. I agree with the observation that the current state of computing has not improved much in
Re:Squeak - Not intuitive (Score:5, Informative)
The basic idea for creating a program in Squeak is to open up the Class Browser. Make a new class. Code away. Depending on what your program does, you may need more than one class.
Or, you can make apps without doing it the old-fashioned way. In Squeak, you can draw up your GUI, composing it with widgets out of the Morphic Toolbox, and then add scripts: when this button is clicked, do this or that, etc. There are some good tutorials for this newer way of making programs.
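For anyone who's never seen it, "make a new class, code away" looks roughly like this - a sketch only; the Counter class and the category name are mine, not something built into Squeak:

"Accept this template in the Class Browser..."
Object subclass: #Counter
	instanceVariableNames: 'count'
	classVariableNames: ''
	poolDictionaries: ''
	category: 'MyStuff-Examples'

"...then type a couple of methods into the code pane:"
count
	^ count ifNil: [0]

increment
	count := self count + 1.
	^ count

Evaluate "Counter new increment" in a Workspace and you get 1 back. There are no files and no separate compile step; the class simply lives in the image from then on.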
It is an (not *the*- anything can be improved) ideal environment for kids- when you've got people teaching them. People used to coding in the same form for a long time often have a hard time learning Squeak. But then again, a lot of old assembly and C hackers have a hard time doing C++ or Java without spending a lot of time thinking about how to design OO systems instead of procedural ones. But old dogs can learn new tricks.
I learned Smalltalk and Squeak on my own, teaching it to myself. I had no problem doing it. I didn't have a textbook- or any book, for that matter. While there were even fewer online docs back in those days, that's where I started, but then I moved to mostly exploring the system. In Smalltalk, you have the Class Browser, which allows one to browse the source code to anything in the system. I learned by example and by doing. So far, that's how I've learned every language I know, and doing it by just reading books doesn't work for me. When I wanted to know how to make a GUI, I looked at the source of the simplest built-in apps in Squeak, learning how a GUI was constructed. Then I made something simple of my own, a simple address book. After that point, it's just a matter of checking the reference- that is, looking at the class hierarchy and the methods that are provided.
I think some personality types don't take well to this kind of exploratory programming, preferring to learn in a more passive way. That's fine- to each her own. Squeak tends to draw folks that do like that style of learning and doing. Once it's learned, it is really handy. "Learning" Java for me didn't take that long, and it's mostly a matter of having the on-line class reference handy for me to write a program. In the best case, Squeak would provide more documentation for those who learned to program the old-fashioned way, but in any OSS community, no one wants to be the one to write such docs.
I'd disagree somewhat... (Score:3, Interesting)
Re:I'd disagree somewhat... (Score:5, Insightful)
Life does a good job of teaching us to be less creative. Our culture is so full of complicated yet boring things that we have to spend most of our time doing that creativity can often fall by the wayside. I'm glad that I had to take all of those math classes in grade school, but every hour that I spent doing my geometry homework was one less hour I could spend playing with Photoshop. Nowadays, I've not only got work to deal with, I've also got to spend my free time paying bills, going grocery shopping, cleaning the house, trying to understand what the hell is going on with the politics in my city, state, and country... when I sit down with a pad and paper and try to design a table that I need to build, I'm too tired to think.
Sadly enough, I think things have gotten worse for kids as well. There are so many different toys, and they have such complex features, they almost take the need for creativity away. An example talked about often on
Anyway, while some people are naturally better at being creative than others, that doesn't mean many people are inherently unable to be creative. Creativity is one of the defining features of our intelligence. It's what puts our minds above those of animals. Anytime you aid the creative process, you improve it. It's not a learned skill per se, it's a re-learned one.
Re:I'd disagree somewhat... (Score:5, Interesting)
I don't know if that's true or not, but you can definitely teach people not to be creative. And that's just exactly what we're doing when we don't give our kids enough art, music, math, and language education.
You either are, or are not.
Maybe, but I tend to think that mostly everyone is born with a creative brain. Some kids grow up learning that it's okay and fun and good to think outside the box and are encouraged to solve their own problems in their own ways. Others grow up getting smacked for coloring outside the lines and are told not to think for themselves.
That said, I think there are a few useful tools to aid the creative process, writing, drawing, music, etc., but I don't believe there are many, if any, tools to enhance the creative process. Maybe computers can't do that.
I'm not sure what the difference is between "aid" and "enhance" above, but one way that computers can aid/enhance the creative process is to stop impeding it. There's probably a whole book to be written on this topic (and Kay might be the guy to do it), but in short I think that software often tends to get in the way more than it helps.
In the beginning, there were assorted ridiculous input systems such as punch cards, paper tapes, and (ha!) rows of switches. Computers weren't much fun to use, and way too expensive for most creative endeavors. (That's not to say that the pioneers of our industry weren't creative.) And then came terminals and command lines, and life was good! Much better than before, but still so expensive that you had to be a really smart and already creative college kid just to get to use one for a bit. (Read Steven Levy's "Hackers" for more on this.) Then came personal computers, which were relatively affordable and inspired all sorts of creativity.
But still, we were stuck with the command line, and you pretty much needed to learn all about "right" and "wrong" ways to do things, and if you did something "wrong" the computer normally did something unfriendly. (Note that text adventure games were wildly popular during this time, possibly because they encouraged one to explore a new world, and aside from maybe getting temporarily killed there wasn't much that you could do that was "wrong.") When GUIs first came into public consciousness with the Apple Lisa (there were others, but a normal person might actually have a shot at touching a Lisa), there was a lot of interest because with this strange new computing paradigm, you could tell the computer to do whatever you wanted, whenever you wanted, and there was little that you could do that was "wrong." At $10,000, though, Lisas were too expensive for most folk. Then the Mac came along and people loved it. It was relatively affordable, and easy to use, and people (Microsoft included) did all sorts of interesting things with it. Even with just two apps, MacWrite and MacPaint, people were transfixed for hours just playing and creating and exploring. About the worst thing you could do resulted in having to swap the floppy disk five or ten times.
These days, computers are a lot more difficult and scary to use. No, don't open that attachment! You never know, it might contain a virus. Don't plug your computer into the network if you don't know the "right" way to do it, because hackers might take over your computer. Why did you set up your document like that? You've got it all wrong. Which of these 300 different commands that do a very specific thing do you want, and in what order?
Tools which inspire creativity are simple ones which don't have a "right" and "wrong" way to use them. Tools like Logo and MacPaint and paintbrushes and drums. You get that sort of (software) tool most often in the early and middle phases of a product's life, when a product is implemented enough to be useful, but before the manufacturer needs to justify the next seven updates and throws in all manner of kitchen-sink features.
Friends, it's time to demand simpl
The PC was not initially used by businesses (Score:2)
In 1987 businesses were finally ramping up with $10-20K PS/2s for CAD and other standalone work. Mainframes and minis were the big boys.
In 1988, I interviewed with a recruiter for EDS. When I asked him where he saw PCs, he said EDS would never develop on them or for them, and that they would never catch on (how wrong
The Profit Motive (Score:2, Interesting)
I think Eben Moglen [wikipedia.org] puts it better in this interview [cabinetmagazine.org].
Re:The Profit Motive (Score:3, Interesting)
The "heydey" he speaks fondly of was one in which a great deal of development was done in labs in Universities or other geeky hacker havens. There you had a culture of creativity, sharing in communities and inspiring each other to create great new things. Perhaps that culture manifested itself in the technology they created.
But now of course we have a culture that is increasingly commercialised and profit-orientated. The result? Exactly the probl
unfair moderation of parent post? (Score:3, Interesting)
Moglen was a programmer back in the early '70s. He wrote free software, not because of his ethics, but because all software was free back then. Software was a tool for users, and users were allowed to fix and improve the tools.
Anyone could contribute to the state of the art by making a small contribution to the edge.
The current proprietary regime blocks that. If you want one more feature in a proprietary word processor, you'd have to write a whole word processor first, and
This could take a while (Score:3, Funny)
Maybe we should use something other than gentoo.
wait.... (Score:4, Funny)
What the hell is he talking about? (Score:5, Insightful)
"The chances that in the last week or year or month you've used the computer to simulate some interesting idea is zero--but that's what it's for."
Is the listener supposed to then ask a simple question like "what would you simulate?" and he would say "everything!" and the listener says "how do you do that?" and he says "by building a model of EVERYTHING!" and the listener, still not understanding what the value of "simulating everything" means, just writes him off as a kook who will research useless ideas for the rest of his life?
Does anyone else understand his vision?
It's true (Score:2, Interesting)
In reality, the correct way to go is to step back and look at how successful home computers worked. Take, for example, the Commodore 64. This had a user interface that came up in about a second, and was immediately usable. Nobody ever looked at my C64 in a confused way wondering what it does. They knew. It was obvious.
A windows PC on the other hand is a nasty co
Re:It's true (Score:2, Insightful)
So does DOS. However, I've successfully frightened people by booting into DOS before. Y'see, the little cursor and a complete lack of visual cues confuse the poor things.
"A windows PC on the other hand is a nasty complicated mess."
Hmm.
"...Even the wiring needs some expertese in electronics"
No it doesn't. You insert the plug into the socket. Also this applies to any computer made since the AT. I have a friend t
Re:It's true (Score:5, Funny)
HELP
?SYNTAX ERROR
READY.
HI
?SYNTAX ERROR
READY.
HELLO?
?SYNTAX ERROR
READY.
EAT FLAMING DEATH
?SYNTAX ERROR
READY.
bull (bear in mind, I did not read the article yet (Score:3, Interesting)
Re:bull (bear in mind, I did not read the article (Score:2)
Moreover, Microsoft Excel is one of the most proliferated tools out there, and VERY few people use it
Half Speed (Score:3, Insightful)
What-ifs (Score:5, Insightful)
"The chances that in the last week or year or month you've used the computer to simulate some interesting idea is zero--but that's what it's for."
I disagree. Many business users use spreadsheets to "what-if". Perhaps he has a different idea of "interesting".
Re:What-ifs (Score:2)
wow, that's quite a schedule! (Score:2)
or was he working for the company with the three-letter acronym between PARC and HP? he better enjoy his current job while he's got it, because on this trend, he's only got one more employer left (and i have a hard time imagining Alan Kay working for X!).
Changing... (Score:5, Insightful)
Things have changed somewhat since then. There's still Linux and new experimental OSes (and BSDs too) to tinker with. Hardware is commoditized so there's not a lot of need or desire to build memory expansion boards, but people still do interesting things. However, the biggest change is that computers are now really cool tools for doing non-computer things.
I can only speak to my interests, but without computers I could not have easily played with video or recording, ray tracing, music production, math (some problems *require* computers to understand, at least in my case), etc. The computer today is akin to what the printing press was several centuries ago. I.e., it gives some very powerful tools to individuals of modest means. So things that were only the demesne of researchers and big companies ten years ago are now available in a relatively low-powered desktop system.
Re:Changing... (Score:3, Interesting)
Think about it (Score:3, Interesting)
How far have we really come in the last 30+ years of personal computing?
The personal computing revolution has stalled with the advent of the WWW. Excluding the MS virus, personal computing was making a lot of progress up until the mid '90s. Since then we've failed to truly exploit the power of both a computing platform and a means of communication. Somewhere along the way we've floundered. It's not necessarily a bad thing, but think about where we could be.
Listen to the guy. He's really just asking where should we be?
Re:Think about it (Score:5, Insightful)
I have to disagree. The real leap from 1995 until now has been usability and people getting connected to the internet. The number of PCs that are "out there" has increased dramatically. In 1995 I could talk to a few of my nerdier friends online. Now I can talk to just about everyone. Communication via computers has really taken off in the past 10 years. PCs over the past 15 years have come to the point where a person with minimal knowledge can use them for online communication.
I would also say we should look at the business world, where there is a PC on every desktop. It wasn't like that in the '70s or '80s. Sure, maybe the PC isn't being used for some great learning experience for the world, but it is being used so people can do their jobs better, including doctors and scientists. How much do you think PCs helped with mapping the genome? It probably worked out a lot more nicely than trying to get time on some timesharing mainframe.
Creativity is covered (Score:3, Interesting)
I work in the music field and almost all the innovation in the last 10 years has come from computers (embedded at first, PCs more recently). With Reason [propellerheads.se], you can turn out a decent tune in minutes. Live [ableton.com] has introduced a whole new way to write and perform music. Those are my favorite examples but there are plenty more.
The film and art worlds have been equally influenced by computer technology.
I'd rather write in Netscape Composer... (Score:2)
You can read a document in Microsoft Word, and write a document in Microsoft Word. But the people who did web browsers I think were too lazy to do the authoring part.
Has Alan ever written a large document in Word? The program is designed for memoranda... it has precisely one nestable object, the table, and the program tries so hard to keep you from nesting them that I ended up embedding a table in a Visio document in a Word document to keep Word from ref
Croquet (Score:5, Informative)
He's probably talking about Croquet [opencroquet.org] which is a 3d collaborative environment developed on top of Squeak. Impressive stuff.
yeah, but what is he doing about it? (Score:3, Insightful)
So, we have those who do the work implementing things that real people actually use (Gnome, KDE, Sun, Microsoft, Apple, etc.), and then we have those who talk about great ideas and grand schemes, but whose implementations aren't all that useful (Kay, the various "usability gurus", etc.). The first group doesn't do enough background research and/or just likes to pretend for PR reasons that they are "innovative". The second group likes to complain about how awful things are but then just doesn't quite get their act together producing something more useful than they do.
How can we improve things? Things get better the more people like Kay take actual implementations a little more seriously and people in "industry" stop reinventing the wheel. And software developers and end users need to become a bit more informed about the products they use and make better choices, instead of just buying what's popular or hip.
Computers as a consumer product (Score:3, Insightful)
The net result of the consumerization of the PC and the internet is a landscape that only wants to hear about what can be packaged and marketed.
He's not wrong... (Score:3, Insightful)
The only difference is eye candy like menus, windows and whatnot.
Otherwise, it's pretty much the same, and, even when you put in particularly creative applications like Photoshop, Illustrator/Freehand, Autocad or any music composing system, you basically have "a better version of an older tool, pen and paper".
There aren't really NEW applications that are really creative; perhaps the only thing that goes close would be USENET if it wasn't swamped by the line noise...
Kay still at HP? (Score:3, Informative)
Since active cynic Carly took over, there is no HP any more.
It's NewAgeP: no more research needed - except for how to suppress printer ink refilling. Product creation sold to Intel (when she notices the chipset guys are doing well, she'll sell those poor souls to Intel too).
Corporate Culture vaporized. Business-is-adding-a-sticker attitude.
What is this guy still sitting in his chair at HP for?
chess
He's got a point.. (Score:5, Insightful)
Nowadays, a Windows PC doesn't even come with any kind of programming language (not counting batch files..) and the GUI metaphor discourages automation of tasks (which was the Great Hope that computing promised..)
The internet has been converted from a fascinating library to some sort of dumb TV plastered with adverts... The increasing and unfettered commercialisation of the internet is gradually making it unusable. I can't even get my site listed on Google, never mind high up the list, because Google's more interested these days in promoting commercial sites. And don't get me started on spammers (unless I've a 2x4 in my hand!)
Learning or Learning? (Score:3, Insightful)
A subtle distinction, I know, but I remember helping teach a class on LOGO a long time ago (ok I was a geek at age 12), and that was the advantage of it for little kids.....they were in charge of the computer, not the other way around. I don't see that philosophy as much today in the widely distributed programs.
Well... (Score:3, Insightful)
Most people are not creative, and most hate to learn. This is a sad truth. The number of people who like to learn new things throughout their life, or create things just for the sake of creating, is a thin sliver of the general population.
I like to do 3D computer art, and have started programming for fun again after a long lapse. Most people who know me, many of them professionals with advanced degrees, can't grasp why I want to do it as they turn back to their latest Grisham lawyer epic.
The sad truth is that the state of personal computing is exactly what the market (i.e. the consumers) wanted. They want games and pr0n and free music. No amount of hand-wringing or high-falutin' pondering is going to change that.
The other problem:
For him, "the primary task of the Internet is to connect every person to every other person."
When people say stuff like this, they are only really thinking about their friends and family, or maybe some small collection of online pals.
You really want to be connected with atrocities like stompthejews.org or purty-yung-thangs-only-mildy-related-to-yoo.xxx or microsoft.com?
Honestly, what is all this infinite connectivity going to bring us over what we have now?
And business, he says, "is basically not interested in creative uses for computers."
No, it's just not interested in what Alan Kay is interested in.
The guy is brilliant, and he's done great work, but I'm afraid he's developed the tunnel vision common to people who have had their egos stroked (no matter how well deserved) for many years. There are some small businesses out there able to automate things that would have required a lot of tedious drudgework in past decades, thanks to those "uncreative" business applications.
Sorry, Alan, but behind all the education and fancy learning objects, there's still a world to run, resources to move about and daily chores to be done. And we're going to use boring gray box computin' machines for it.
"pretty much everything that's believed is bullshit."
OK, now here I agree with him. :-) But he might want to apply the bullshit test to his own beliefs. I try to do it on a regular basis. It's sometimes painful to let go of a closely held belief, but if the facts do not support it, you have to dispose of it.
Screw you Kay (Score:4, Funny)
Dude, I use it every night to simulate a girlfriend, and that is pretty damn interesting.
Kay should take a break from all of this research BS and check out some of the great porn on the internet. He wouldn't be so down on the state of the industry then.
Squeak and Objects (Score:3)
The Squeak programming environment along with the Korienek, Wrensch, and Dechow book [aw-bc.com] were what made the idea of Object-Oriented programming really click in my brain. Even if you never program a "real" program with Squeak, the value of Squeak is that you can really learn OO principles without the baggage of a C heritage and designers who've shortcut language consistency in the name of efficiency. All are good things you may want to make the trade off for when programming a "real" program, but not things you want to short yourself on during your education.
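The "no C baggage" point is easy to see in a few lines - a toy sketch, not taken from the book: numbers, booleans, collections and even control flow are all just objects receiving messages.

3 + 4.                                                   "send the message #+ to the object 3"
(3 < 4) ifTrue: [Transcript show: 'smaller'; cr].        "ifTrue: is a message sent to a Boolean"
5 timesRepeat: [Transcript show: 'again '].              "loops are messages too"
#(1 2 3 4) inject: 0 into: [:sum :each | sum + each].    "=> 10"

One rule for everything, which is exactly the consistency the parent is talking about.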
Creativity in the early '90s (Score:3, Interesting)
Come to think of it, it was pretty amazing given the poor technology of the times (a mere 2 MB RAM, endless floppy-swapping -- later, a "huge" 20 MB HD). The creativity of the programmers was itself amazing. They did their sound mixing routines alright, MIDI + sample synchro, and the user interface--the user interface!!--was the best thing ever.
And yet today, maybe 100 times the number of Windows PCs are out there, with 100 times the CPU power each, but I still can't find an honest tracker for my Windows machine-- when I say honest, I mean one that won't crash my PC or ask that I buy a damn compatible soundcard. I also mean "free"; I mean come on, who's going to spend C-notes on professional sequencer software just for dabbling around!!
Dudes in the '90s, up there in Finland & other places, were swamping Europe with their trackers at a time when "electronic distribution" was a euphemism for a network of enthusiasts swapping floppies through the post and holding "copy parties" at each other's places.
Now we've got the Internet for distribution, we've got rather less fragmentation in the OS space, and you'd have thought it'd all have made it much easier?? Think again!
Sure, back then we weren't able to download Britney Spears MP3s for free... Hell, if we had, we wouldn't even have had the CPU power to decode them!! But what's the new thing there? I mean, you just listen to the same music as in the store, except cheaper...
To conclude: quit consuming pr0n and mp3s, start coding mind-opening stuff for the masses to discover their own talents!
(and stop reading / posting on Slashdot too, I might add)
He's Right (just poorly expressed) (Score:3, Insightful)
Most people do use computers primarily to simulate objects that they understand because they have physical samples of those objects (appliances, documents, etc.) in front of them in their daily life. What I took as his meaning was that the computer's ability to make manifest ideas and concepts that do not have common tangible real-world instances is commonly neglected, and should not be. In this respect he is entirely correct.
But the problem in my view is not that no one has tried to foster such uses by making computers easier to use and understand in this capacity. There have been plenty of attempts to do so, many of them in games, some in teaching languages like TURTLE. It is rather that there are few examples in real life of using manifestations of abstract objects to do something useful, or at least entertaining. Face it, most people don't subject themselves to a sit-down session with a computer unless they think they are going to get something out of it, and "modeling" intangible systems is a hard sell in this respect, especially for those who have not been taught the intellectual building blocks needed to approach such a task with any degree of confidence.
Maybe if there were a collection someplace of testimonials and explanations by those few who have managed to get a significant real-world benefit from doing something truly abstract, it would inspire users. Some would argue that applications are that very thing, but what I'm suggesting would be more of an explanation of the human process involved -- how a person thought his way through a new or unusual application of a core technology to improve his life, rather than a spoon-fed procedural guide to doing the same thing without comprehending the thoughts behind it, which is what most applications are in the end.
A popular game that had a programming component could also break the ice by making it into entertainment, but making it popular versus all the competition would be the obstacle to that...
And what has Alan Kay done since 1980? (Score:3, Interesting)
(Lets see if the moderators can distinguish a contrarian opinion from troll-bait.)
Patterns and the lazy human attention span (Score:4, Insightful)
When the Michigan Senator (D) in the (highly recommended) movie Fahrenheit 9/11 [apple.com] responded bluntly to the question "Why didn't you read the Patriot Act before passing it?" with "Sit down, my son, we don't read most of the bills we pass," it was quite laughable but very chilling.
Legal ignorance is at an appalling level, even among people paid and elected to represent us. Computers are good at pattern recognition; and most people despise reading the mumbo-jumbo lawyers hide their meaning within.
Perhaps a "pocket lawyer" to help parse legal mumbo jumbo is a worthwhile thing. For most people law is a one-way street, you have to read what the IRS, city, and state send to you but you rarely have to write anything yourself. (Though Nolo and some other "mad lib" style books do a wonderful job of this).
While there are lawyers who are trying to be devious and hide their real purpose in contorted language, government agencies should have no need to do so. Require that court rulings, city council decisions, and any record of law be stated in English and Backus-Naur form [wikipedia.org]. Rely less on the vagaries of English to preserve or hide your meaning while the OED is changing the language (bling-bling? vavavoom?) and hence changing the law through its evolution.
Yes, and... (Score:3, Insightful)
The truth is that people make any general purpose media or device do what they want to do, or relegate it to irrelevancy. What most people want is to be passively entertained (couch potatoes). Build a device that can only be used for lofty goals, and nobody will buy it.
Alan Kay's blind arrogance (Score:3, Interesting)
Behind the Times (Score:3, Insightful)
Yeah! Computing smells... (Score:3, Interesting)
After more than 20 years of programming, my opinion is that Alan Kay is right. Those who are old enough know that there were once expectations (i.e. computers will understand human language); now there are refinements (oh, look at that, spell-check on any text entry, wow!).
Even the most successful idea of those years, the web, was already (and probably better) designed in the Xanadu project.
Hardware is still worse: one single schema, a single processing unit, lots of memory, and a hard disk, that's all. Where are those Prolog machines? I remember a small English company that built a nice small blue box able to outperform some CRAYs at graphics processing. That was creativity.
Computing has fallen victim to its own success; there was business and money to be had, and now big corporations are unable to do anything but continue with the same old crap. Of course innovation is lost; the only thing that gives software an edge is that it is a personal activity, which is why open source still remains. But the big picture is depressing: software is under MS control, and hardware is under Intel's direction. That's falling short, friends, very very short.
Reporters always get it wrong. (Score:3, Insightful)
Some simple rules for reading anything written by a "journalist".
1. The more you know about a subject the more the journalist will get wrong.
2. The shorter the article the more will be left out and gotten wrong.
3. The more complex the subject the more will be gotten wrong regardless of article length.
So in this case we have a short article by a journalist of unknown technical credentials writing for a target audience with no technical credentials, and people are complaining that the small quotes from someone with DEEP technical credentials on a VAST subject area are bozo-y? Please. Show me an article _BY_ Alan Kay written for the ACM and then I'll pay attention. This article is just fodder for CEOs to annoy their IT shop with.
PHB: Alan Kay says we should be modeling our business so we can make more money. Get on it.
IT: I'll get right to it after I install the latest critical Windows/IE update and wipe the latest virus from all the machines on our network. (i.e. Never.)
- Jasen.
Deconstructing Kaye (Score:3, Insightful)
I found this passage from the middle captures his arguments succinctly:
Depends on the business. Most businesses want predictable, repeatable, accurate, auditable activity done with their PCs. Accounting is an example of a business that does not WANT creativity. :-) I am assuming he's not talking about these bread-n-butter computing problems but about what's done on the desktop, but he also has to remember that the desktop user has to work in that "boring" business environment, and most jobs discourage creativity in order to "maximise efficiency".
Some jobs will benefit from creativity, and in those cases, most people feel their PCs (especially the Mac crowd) do encourage their creativity. But I can't help wondering if he's so obsessed with being creative that he's ignoring the fact that some people don't need creativity in their jobs; also, if they are being creative, they don't want to be creative in the way he wants to be creative.
Here's an example of his disconnect. Maybe they're not doing it in the way Kay wants to see it done, but it's done all the time with various tools, mostly spreadsheet-based ones using plug-ins for Excel. People find the spreadsheet the most comfortable tool for modeling things and simulating their company on paper. Hell, there are some really nifty third-party plug-ins for Excel that can do Monte Carlo simulation on your spreadsheet data, as sketched below. You provide some extra information about your values, like variance, etc., and the plug-in will calculate the outcome curve of your model. And there are some really cool tools for MS Project to model how your project works!
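If "Monte Carlo on a spreadsheet" sounds exotic, the idea itself is mundane. Here's a rough sketch (in Squeak Smalltalk to stay on topic; the revenue and cost figures and the uniform spread are invented, and this isn't modeled on any particular Excel plug-in):

"Re-run a toy profit model thousands of times with noisy inputs,
 then look at the distribution of outcomes instead of a single guess."
| rng trials profits mean |
rng := Random new.
trials := 10000.
profits := (1 to: trials) collect: [:i |
	| revenue cost |
	revenue := 100 + ((rng next - 0.5) * 40).
	cost := 70 + ((rng next - 0.5) * 20).
	revenue - cost].
mean := (profits inject: 0 into: [:a :b | a + b]) / trials.
Transcript show: 'mean profit: ', mean printString; cr.

The Excel plug-ins do essentially this, plus letting you pick a distribution per cell and charting the resulting curve.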
From my perspective, modeling happens all the time and people are using their imaginations to model and work with some really nifty things - from small businesses, to the home user figuring out their portfolio balance, to the engineering company using their PCs to model new ways of designing structures! It just might not be the way Kay wants to do it.
I think Kay is confusing the way he wants to be creative and how he thinks with how everyone else should think. Berating people for not thinking like you do is, to me, the antithesis of creativity.
I think he's trying to say that PCs should transcend just trying to be a poor simulacrum of pen and paper. On the surface, that sounds seductive: your PC should take all that drudgery away from you, leaving you free to think. Let the PC do all the thinking and work and you do all the creativity. As someone who likes to think of himself as creative, that sounds... stupid. Painters like the feel of paint on canvas. Harlan Ellison loves the effort it takes to push the keys on his mechanical typewriter. Most artists consider the "drudgery" part of the creative process. It's a challenge to your imagination that spurs you forward. The effort of collecting and working the clay is considered a key part of the pottery-making process. Just going to a shop to buy the clay is considered death to the process. Being truly creative is about taking all there is inside you and expressing it. Making it "easier" is missing the point.
Kay also believes that the drudgery inhibits creativity, which it doesn't. You will be creative even if you have to use a stone and a cliff face. Making it easier will not increase your creativity, nor will it improve its quality. If you want to make PCs more use
Re:Deconstructing Kaye (Score:3, Insightful)
Re:Read between the lines (Score:2)
Re:Cue the Apple zealots ... (Score:5, Interesting)
Ok - I'll bite. And I'll bite for personal computing at large, rather than just as an Apple user (which I happen to be, but the below could be achieved on any platform).
The very interesting articles I've been digging out recently are on how to play the clarinet. I do use my machines to write music. I quite definitely have my photo albums on the machine. I'll add video to your list too, and DVD authoring. I'll add web authoring. I'll add accounts - not exciting, but definitely simulating ideas. I'll add communication - email and video conferencing with friends who are at least hundreds of miles away, in some cases on a different continent. In my case, I'll add development too. And yes, when circumstances allow I sit in my garden and use the 802.11g connection.
I honestly, truly, have no idea what Alan Kay is on about. Generalising about the whole of computing from a business knocking out office documents is a bit poor. Then again, the article didn't have much in the way of direct quotes from Mr. Kay - perhaps his main thrust has been misunderstood?
Cheers,
Ian
Re:True purpose of computing (Score:2)
Computers are just tools, some of the most versatile ones that humans have ever imagined. Kay's just upset that the wrench he helped invent turned out to work really well as a hammer, and more people need hammers than wrenches right now.
Re:Please hush up (Score:4, Interesting)
He _knows_ it's a browser; his assertion is that HTTP should have been like WebDAV from the beginning, and that instead of writing a browser, they should have written a browser with authoring capabilities.
The trouble is, that you're looking at the world as it is now, and saying "it's obvious, this is how things should be", instead of looking back and asking yourself how things could be different....
Sure, he's not going to change anything by saying what he's saying, but that doesn't mean it's not worth saying.
Personally I pretty much agree with the overall sentiment - when I was a kid my first computer experiences were with the 8-bit home computers of the 80s - the ZX81, ZX Spectrum, Commodore (16/+4 and 64) and Amstrad CPC - and every single one of those did far more for me in terms of encouraging my creativity than a modern PC does. Simply because they came with BASIC built in. Programming was what you _did_, and it was so easy to get started. These days the barrier to entry is much higher, and if you look at Windows, it doesn't even come with a programming language any more. At least DOS had QBasic - in fact, DOS with QBasic was almost as good as the 8-bit machines...