Literacy: Natural Language vs. Code 534

sirReal.83. writes "The Guardian has an article by Dylan Evans, author of Introducing Evolutionary Psychology. The article discusses literacy in computer languages, and suggests that we are in the 'technological middle ages.' Cuddly UIs are the manifestation of wishful thinking; much as we would like computers to adapt to us, we must adapt to them." Some good points are raised, with the command line interface used as an example, which is a much better choice than, say, an array of switches or a punch card.
  • by Nooface ( 526234 ) on Saturday November 08, 2003 @10:35PM (#7426813) Homepage
    For a more in-depth discussion of this topic, see Neal Stephenson's essay In The Beginning Was The Command Line [cryptonomicon.com].
    • First off, he cites the Windows example; perhaps he is unaware that Windows is quite powerful with scripts and command lines.

      Likewise, his bias on command lines vs. graphics is archaic. Some human/computer conversations are better conducted in text. Other human/computer conversations are impossible without graphics.
      • umm, Windows is VERY weak on the command line. Unless you install Cygwin or some such, the windows command line is barely functional as compared to other OS's. Only the Classic Mac OS is worse for command line use.

        Windows scripting is both worse and better than the command line. By default, Windows scripting is a joke; there is no real scripting capability. However, 3rd party scripting utilities integrate significantly better than 3rd party command line utilities. But by default, Windows comes in dead last for script
        • by Anonymous Coward
          Longhorn should change this, with the rumoured next-gen command line implementation that approaches Unix-level capabilities.

          Approaches? Are you crazy? MSh goes so far beyond Unix shells it's not funny. Rather than the "everything as a stream of bytes" approach[1], MSh will use real objects. It features multi-language support. Objects in the filesystem. Full, multi-directional, conditional pipes. All features missing in Unix shells. Face it, Unix is 30 years old. It needs to be updated.

          [1] Can someon
          • by GileadGreene ( 539584 ) on Sunday November 09, 2003 @02:40AM (#7427638) Homepage
            "everything as a stream of bytes approach"... Can someone explain whats so great about this? Doesn't that encourage ad-hoc interfaces, a poor programming practice?

            Perhaps you should go and browse through Eric Raymond's new "The Art of Unix Programming" book (you can even find it free online). He discusses the stream approach, and why it's a good idea (vs., say, the object approach). I'm sure others have covered the same ground, but that's the reference I've seen it in most recently.

          • by Alex Belits ( 437 ) * on Sunday November 09, 2003 @02:52AM (#7427681) Homepage
            If everything is a stream of bytes, any kind of interface can be created with it, based on its purpose. Simple interfaces can hide complex structure by just exchanging blocks of data blindly, complex ones can use parsers (a thing that Windows programmers never learn) and implement any format easily. And this clearly separates the programs -- they never ever touch each other's address space and can have any, simple or complex, design of their objects. Also whatever can be done with pipes, can be done with sockets, thus allowing remote and distributed processing without any change to the software doing it.
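
            A minimal sketch of that composability (the input here is made up; each tool sees nothing but bytes on stdin and stdout):

            ```shell
            # Find the most frequent line in a stream. None of these tools knows
            # anything about the others' internals -- just bytes in, bytes out.
            printf 'b\na\nb\nc\na\nb\n' | sort | uniq -c | sort -rn | head -n 1
            ```

            The same pipeline works unchanged whether the stream comes from a file, another pipe, or a socket.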

            Dotnet's "command line", judging by the descriptions that Microsoft made, allows things to just sit in one happy clusterfuck and mess with objects while they are freely floating inside the shell application, something that is in no way different from writing "scripts" in a large spreadsheet. This is yet another example how Microsoft invents 65537th iteration of the mix of their original DLL and DDE, two ideas that they still can't get right after more than a decade of development.
          • Why in the hell would I want OOP at the script level?

            Scripting is supposed to be quick, dirty and robust.

            OOP can be one of those three (Robust). It's the wrong paradigm for scripting.

            And it really complicates the interaction of multiple scripts and/or command line utilities.

            The 'Stream of text'/Everything is a file approach is proven, and ideal for this type of use. 'Everything is an object' is needlessly complicated, and quite problematic for the kinds of use that scripts are for.

            So WSH, while buzzword
    • by mesocyclone ( 80188 ) on Sunday November 09, 2003 @01:32AM (#7427427) Homepage Journal
      It's unfortunate that Stephenson, in that essay, seems to believe that Gates and Allen invented the idea of selling software. It is so typical of the PC generation to imagine that the first person to use an idea on a PC invented it.

      A relative of mine was selling software in the early '60s. I worked for a company selling software in 1971. I wrote a command line interface for a teletype in 1969, and first used one in 1967.

      Likewise, I first saw a hyperlinked GUI presentation at an FJCC in 1967 or 1968.

      As far as the article that started this thread, it is idiotic. It was either done by someone who has no clue about software engineering, or who suffers from recto-cranial insertion. Probably both!

      People have been trying for a very long time to figure out how to KEEP folks from having to know all the dirty little details of computers.

      By the logic of the article, we should also all become logic engineers, and then solid state physicists, and finally wave Schrödinger equations around to understand how the computer REALLY works. Those of us who have done all of that still end up specialists who don't do more than a tiny bit, and those who are not specialists in that area don't need to know it, or even know that it exists.

      I tell folks who really want to know how a computer works to learn assembly language, and then study the internals of an OS. Then they at least understand what a computer *does*. But they still don't know how to build one, nor should they!

      Personally, I use command line for most of my work - cygwin on Win2K for most, Linux for some. But I would NOT want my wife to have to use it, nor my daughter the neuroscientist!

      • nor my daughter the neuroscientist!

        Given that the current poll is Internet dating, may I ask if your daughter is

        a) already attached
        b) under 35
        c) cute

        And no, I've never done this before, honest!
      • Missing the point (Score:4, Insightful)

        by sjames ( 1099 ) on Sunday November 09, 2003 @01:12PM (#7429401) Homepage Journal

        People have been trying for a very long time to figure out how to KEEP folks from having to know all the dirty little details of computers.

        You're trying to carry the author's point way too far (to the point of absurdity).

        By his references to 'Windows', I believe that the author is well aware of those efforts. The point is that those efforts have gone way too far in the wrong direction. Things should be simplified as much as they can be and no more.

        Most 'user friendly' GUIs over-simplify to the point that tasks which are simple on the command line become nearly impossible and horribly time consuming. Haven't we all seen people do open-search/replace-save-close over dozens of files because they don't know how to use for and sed on the command line (and thus turn a 10 second job into an all day adventure)?
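
        As a sketch, that all-day adventure collapses to a few lines of shell (the file names and the typo being fixed are hypothetical):

        ```shell
        # Replace every "teh" with "the" in all .txt files in the current
        # directory, using a temp file so each original is rewritten in place.
        for f in *.txt; do
            sed 's/teh/the/g' "$f" > "$f.tmp" && mv "$f.tmp" "$f"
        done
        ```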

        The real argument is one of basic literacy, not mastery. We don't expect everyone in the western world to be a literary genius or to know how to design, build, and run a printing press. However, we do expect them to be able to read and write. In the same way, most people can drive a car even if they can't design one or even repair one. Many who can repair a car can't design one from scratch. Just as we don't hire a driver whenever we want to travel by car, and don't hire a scribe when we want to write a grocery list (and then a scholar to read it back to us as we shop), we shouldn't have to hire a programmer just to interact meaningfully with a computer.

        Just as we don't avoid the scribe by drawing little pictograms of the foods we want to buy, we shouldn't avoid the programmer by using little pictograms as a replacement for knowing how to make a computer do what we want.

        There are people who get by without literacy in the west, but I don't think anyone tries to deny that they are seriously handicapped, nor does anyone propose that the solution is to eliminate the need for literacy to fully participate in society. (Yes, there are some efforts at pictographic signs, but those are mostly for people who are presumed to be literate in some other language.)

        It's not a matter of expecting everyone to learn shell scripting right now; it's more a matter of expecting that simple scripting and programming will become a basic educational requirement, just as literacy went from being arcane knowledge laboriously learned in early adulthood by a few monks and other scholars to being a basic skill we all begin learning before age 6. Efforts to the contrary are a waste of resources.

      • As far as the article that started this thread, it is idiotic. It was either done by someone who has no clue about software engineering, or who suffers from recto-cranial insertion. Probably both!

        According to the article, the author cowrote an essay on the Matrix and evolutionary psychology.

        I guess that proves your point...

        But I would NOT want my wife to have to use it, nor my daughter the neuroscientist!

        I agree, but not because the command line is inherently hard.

        For example, it's great to be able
    • by Frater 219 ( 1455 ) on Sunday November 09, 2003 @01:40AM (#7427443) Journal
      What Evans' article suggests -- a conclusion, I might add, that I would not attribute to Neal Stephenson -- is that understanding computation has become necessary to individual freedom in our computerized society. That is, because we have to use these immensely complex and versatile artifacts, we must understand them and be able to control them, in order to call ourselves free. If we cannot control them, it follows, they will be used to control us.

      In the olden days, there was an expression used to refer to those disciplines and sciences deemed necessary to the free man. That expression was the liberal arts. Though today we might associate that phrase with endless "humanities" classes, or with a college degree not useful for any particular career, of old it meant simply those arts -- practices -- necessary to exercise the liberty of a free citizen. The classical liberal arts were seven: grammar, rhetoric, logic, arithmetic, geometry, astronomy (for which read "physics other than ballistics"), and music.

      (Please note that literary criticism, social theory, and deconstruction are not named among the liberal arts.)

      We still recognize (I hope) that one who cannot recognize a fallacy in argumentation, or who cannot do arithmetic, is severely impaired in exercising the freedoms of man and citizen. A person who is unacquainted with works of literature may miss cultural references in a politician's speech, but a person unable to cope with rhetoric and logic cannot even tell if the speaker is contradicting himself. Likewise, one who cannot add and subtract cannot tell if he is being cheated in the marketplace.

      From a classical viewpoint, what Evans is suggesting is that an understanding of computation has become a liberal art: a discipline necessary to exercise freedom. It is unfortunate and misleading, however, to frame this in terms of "programming languages" or "command lines" -- both of which are simply abstractions (just as is the GUI) on top of the mathematics of computation. The essence that must be understood is no language other than mathematics.

      (As an aside: Historically, computer science -- which has little, I might note, to do with "knowing programming languages" -- is an outgrowth of mathematical logic, which is itself an extension of the liberal arts of arithmetic, geometry, and classical (syllogistic) logic. Thus, Aristotle and Dodgson, to pick two, prefigure Turing and McCarthy.)

      The same fundamental calculi of functions, algorithms, and Boolean binary logic underlie all of the abstractions we may encounter in computing. GUIs, shells, assemblers, virtual worlds -- all of these are necessarily founded upon the same mathematics. No matter how complex the language or how pretty the interface, it must abide by mathematical logic or it cannot function.

      Thus, if Dylan Evans seeks, with Neo, "the code behind the graphics", he should not look to the Unix shell, to C, or even to machine code to find it. Those are tools, not truths, and freedom comes with understanding truths, not simply with mastering tools. Learn the liberal arts -- mathematics and logic -- and you will be much better prepared to defend yourself as a free citizen in a computerized world.

      • by Ignis Flatus ( 689403 ) on Sunday November 09, 2003 @04:22AM (#7427826)
        Learn the liberal arts -- mathematics and logic -- and you will be much better prepared to defend yourself as a free citizen in a computerized world.

        Nah, to be "free" in our society means having power. And for at least a couple of centuries now, the easiest way to obtain power is to become a lawyer. Learn to defy mathematics and confound logic through the power of law.
      • by Dr. A. van Code ( 143149 ) <d_r_conrad&yahoo,com> on Sunday November 09, 2003 @04:57AM (#7427888) Homepage
        Unfortunately few U.S. citizens today can recognize a fallacy in argumentation. In American politics debates are usually won by whoever can yell the loudest, beat the most strawmen, scatter the most red herrings, poison the well the most effectively, do a better job of making ad hominem attacks stick, or lie the most convincingly. We desperately need more of the classical liberal arts.

        I agree that understanding computation has become an indispensable skill and should be considered one of the modern liberal arts. But I must take issue with your argument that a grounding in mathematics and logic is all that is needed.

        Programming languages and command lines may be only one form that the underlying math behind computers takes, but learning them teaches important lessons about how information is represented and structured in computers. Of course, there are other approaches that will bring about the same understanding. But focusing on the pure abstract math behind it all without ever getting down to the nitty-gritty implementation details will never give anyone the skills needed to survive in the modern world.

        One example should suffice to demonstrate the point. This evening I helped a friend with some computer problems he was having. One problem was an unwanted program that was running every time he booted up. A few moments with regedit quickly cleared up the problem. I'd like to think I'm fairly well versed in math and logic (yes, even Boolean algebra), but no amount of general knowledge would have been enough for me to solve the problem at hand, and, more importantly, it wouldn't have been enough for him, either!

        I knew my way around regedit and the Windows registry; he didn't. Let's have more (and better) education in logic and mathematics (statistics is also vital but too often overlooked), but let's also teach the tools and the practical application of the concepts. Otherwise we'll just produce computer idiot-savants.

      • Great post.

        I think I'll link to How To Design Programs [htdp.org]. It's a book from MIT Press aiming to teach computing as a liberal art. The preface has a nice discussion of why everyone should learn how to program - much nicer, IMHO, than the Guardian's article.
      • by The Cydonian ( 603441 ) on Sunday November 09, 2003 @06:45AM (#7428120) Homepage Journal

        You seem to be arguing, quite well I must say, for freedom from tyrants who are other people (or people-like beings), while the guy seems to be arguing against the tyranny of the machines (more precisely, the tyranny of the act of controlling machines). Which is not to say you are wrong in your summary, of course; just that we're talking about sliiiiightly different things.

        Which is exactly why I find this surprising, since I always thought The Matrix was all about this. The author, it seems, is unable to understand the concept of abstraction; to become a truly powerful (not necessarily great) programmer, you don't need to figure out how your program works in detail, just as, to make a satisfying omelette, you really don't have to know the chemistry behind an egg. Fact of life, people: abstraction lies at the root of all civilisation, although you're right that to maintain or extend civilisation as we know it, we need other tools such as logic.

        His other general point, however, is well-taken; programming is, to be sure, fast becoming critical for a career in any scientific discipline, but for reasons other than those he mentioned. Among other things, agent-based modelling (pun wholly intended) is the Shiny New Paradigm (tm) these days in the social sciences, and obviously you need programming for swarm intelligence and all that.

        But no, you don't -- shouldn't -- need to be able to program your CMOS for that.

  • sigh (Score:2, Insightful)

    by russellh ( 547685 )
    sigh. where are you, ted nelson?
  • by Anonymous Coward on Saturday November 08, 2003 @10:39PM (#7426828)
    Microsoft is oppressing the masses with the GUI. Everyone must learn a scripting language in order to manipulate information. Suuuuuuure.

    Who the hell wrote this article, the union of all slashdot posts?
    • by hellswraith ( 670687 ) on Saturday November 08, 2003 @11:13PM (#7426960)
      I can't agree more. This is about the most stupid article I could ever have hoped to see on this topic. I am sure some short-sighted person said such things about the automobile when it first arrived. Probably something like "If you don't understand how your car works, and can't fix it yourself, you will be walking everywhere."

      Give me a break. Programming has become such a complex subject that there is no way the majority of users could ever hope to achieve the level of proficiency needed to code even the simplest of applications. It takes a majority of coders 5-10 years to become 'experts'. That is why there are application developers, and the users they write applications for. If we (the application developers) do our job right, then we can satisfy the users' needs without them having to know how to write code. This lets the users concentrate on using the applications that were written for them, and on the other business needs that they spend their time studying.

      To prove my point that this will never happen, I have an example. Out of the 40 people that started a Java class in my college, only 6 of us finished it. 34 couldn't keep pace and couldn't understand it. The class wasn't that hard. One chapter a week, and one little app a week to reinforce the chapter's materials. How is 'everyone' going to learn programming if that many can't hack a beginning class?
      • I had a boss with the same line of thinking. He always went on and on about how that was the reason he used Linux and not Windows, because for some reason if you can't do something in the most complicated way possible, you don't "truly understand how computers work" and are therefore stupid. Or something like that.

        As you can infer from that, he was a complete asshole in general, which is why I quit. :D
        • Since you quit anyways, you should have snuck into his office once or twice and cut the cord off his mouse. Why should he need to use a mouse?

          If you had the resources at hand (probably not) you should have pulled his PC entirely and wheeled in a mini-computer with a switch panel and lights, like a PDP-8. Why should he use a computer that boots automatically? It isn't that hard to memorize the five or six octal command sequences to bootstrap a PDP-8 and get it started reading the high speed paper tape to
        • Too bad you quit. You might have learned something new and valuable by listening to your boss, asshole or not. I'm the only one at work who predominantly uses Linux at home (2k for HalfLife only) and am constantly consulted at work for solutions to Windows problems, including from the GUI-clicking IT support staff.
          • That's not to say I don't know how to use Linux or the shell prompt (hint: I run a Linux server), but if someone constantly hounded you about driving a car with an automatic transmission, citing its "easiness" and your inability to "truly understand how cars work" since you don't drive a stick... well... you see where I'm going.
      • 200 years ago do you think you could have taught English reading and writing to a bunch of farmhands with no stake in actually learning it? Today everyone can read and write, because it's nearly impossible to get by without it.

        If, as the article says, in the future writing code will be nearly as important as writing English, then like English, programming will be taught to our children. So a high school graduate will already have about 10 years experience with it.

        Not that I believe it will happen. (th
        • by Daniel Dvorkin ( 106857 ) * on Sunday November 09, 2003 @09:04AM (#7428391) Homepage Journal
          I'm always amused by people with no sense of history throwing around phrases like "200 years ago." Apparently you just picked a number that seemed like "a long time" and put it up on screen. If you'd said 500 years, I probably wouldn't have taken issue, but ...

          200 years ago. 1803. The US -- a nation which had come into existence, to a large degree, because of its large literate peasant population -- was just getting on its feet. The beginnings of reform (extension of the franchise, etc.) were taking shape in Britain, because for the first time, the farmers and the growing industrial working class were demanding it -- and they did so because they read. (It would take a few more decades for this to come to fruition, granted.) France was in the middle of the turmoil of the Napoleonic era; say what you will about the guy, but he tore down the ancien regime's policy of keeping the peasants ignorant, and set France on the road to democracy. In fact, of the great powers of the day, only Russia was able to stifle the urge of its unwashed masses for knowledge and freedom ... and a large part of the reason why today, the US, Britain, and France are prosperous, stable democracies while Russia is essentially a giant Third World country with nukes can be traced back to this.

          In short, the great story of the early 19th c., at least in the West, is the "rise of the demos," for the first time in history. And it happened because "a bunch of farmhands," all over the world, wanted to educate themselves. Never underestimate people's ability to see what it will take to build a better life for themselves and their children.
      • "If you don't understand how your car works, and can't fix it yourself, you will be walking everywhere."

        That's a bad analogy.
        A car enhances your muscles: you should never need to understand it any more than you need to understand your legs. (a little practice and go, right)

        A computer can be used for many types of jobs.
        Some of those jobs could require little to no understanding of logic.

        For thinking jobs however, you will need some competence to be good at them. In this case the issue is not so much "und

      • by mcrbids ( 148650 ) on Sunday November 09, 2003 @02:50AM (#7427672) Journal
        To prove my point that this will never happen, I have an example. Out of the 40 people that started a Java class in my college, only 6 of us finished it. 34 couldn't keep pace and couldn't understand it. The class wasn't that hard. One chapter a week, and one little app a week to reinforce the chapter's materials. How is 'everyone' going to learn programming if that many can't hack a beginning class?


        The strange thing is that computer programming is getting both harder and easier at the same time.

        Things that were traditionally difficult are now easier than ever. However, the things that are expected of computers today make yesteryear's problems pale.

        As languages evolve new capabilities, expectations rise to meet them - and the net effect is that the power of computing never really makes it to the average Joe.

        This is news?
        • by gad_zuki! ( 70830 ) on Sunday November 09, 2003 @09:57AM (#7428553)
          >Out of the 40 people that started a Java class in my college, only 6 of us finished it. 34 couldn't keep pace and couldn't understand it.

          Well, look at how programming is taught today compared to the late 80s.

          When we were taught BASIC on machines that were only command line, people were quick to pick it up. Why? Because you could actually do stuff with it. At the time, an app that did simple math, output to the printer, or displayed a simple ASCII graph was exactly what computers were for, and being able to write one with a few lines of BASIC code was actually empowering.

          Now look at what computers do: everything. And they do it with neat-o GUIs. Also, many commercial apps are written so you don't have to program. You don't have Ford telling you that you should be popping open the hood on a brand new car and doing some major work, do you?

          Anyway, walk into a Java class. The first thing they teach isn't how you can use Java to solve problems like sorting text files; they throw the bible of OOP at you. OOP is fine and good, but if you don't have some procedural experience under your belt and don't know your way around at least one other language, Java is just going to be a mix of OOP, classes, and other junk, and a lot of people are not going to see how it all connects to their everyday tasks. Even if you master Java, you're writing horribly slow apps designed for cross-platform use. Not exactly empowerment there. Sure, you can move to any language from there, but starting to learn programming with Java is like a kick in the teeth.

          I think Dylan should have focused on how empowering HTML, JavaScript, and PHP are. After reading a book or looking at a few examples you can quickly get the gist of HTML. Same with JavaScript. The stuff runs, it does stuff, you can show it to your friends. Shift to PHP and you're doing tons more, while your Java programmer is fighting his or her way through a complex language with a strict syntax (at least a lot more unforgiving than HTML or PHP).

          If there's a lesson here, it's to embrace modern tools that accomplish something. Moving back to the command line is silliness for most people, as they never leave the GUI and don't expect a command-line program to be of much use. Giving them the power to generate GUI-like apps through HTML, etc. is much more useful than spending 18 hours learning how to use cat, emacs, pipes, uniq, head, tail, etc. on Cygwin.
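
          For what it's worth, the kind of one-liner those 18 hours buy looks like this (names.txt is a hypothetical input file):

          ```shell
          # List the last three distinct entries of a sorted name list.
          sort names.txt | uniq | tail -n 3
          ```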

          It's a web/GUI world. This is what people should be adapting to. The days of simple DOS-like programs are far behind us, and a lot of scripting tasks can be done within robust GUI apps.
    • by Spyffe ( 32976 ) on Sunday November 09, 2003 @03:47AM (#7427767) Homepage
      I agree that it is not the GUI per se that is the instrument of oppression. It is, instead, the adoption of mediated tools to access information.

      For instance, when I want to find something on the Internet, I use Google. I trust that if I type in two words, Google will find me Web pages that contain those two words. I have no idea how they do this, because they keep it secret.

      This is the dangerous thing. The Merovingian (watch Matrix II Reloaded!) would love it: I type in words and I get links and I click them, with no idea of why I got them!

      Nowadays, that's reasonable (although Google is already starting to remove links that are extremely unpopular or that expose it to lawsuits). But in the future, Google's mediation of my interface to the Web could really hamper me.

      If at that point, I continue to use Google with no understanding of how to spider the Web for myself, I'm screwed. My searches will be controlled by Google, and I will be jacked in to their particular Matrix, seeing only information they choose to purvey.

      Similarly, right now I use a PowerBook for everything. I have a Japanese DVD (Spirited Away) that I want to watch. But I can't, because the firmware in my DVD-ROM drive locks me out. I have allowed another company to mediate my experience of the data on the DVD. They have chosen to limit what I can see, and because I don't understand their hardware (i.e. I can't reprogram their BIOS) I am at their mercy.

      The GUI can be a powerful tool. It can enable one to visualize what is going on in an extremely detailed fashion. But if I don't know what's being visualized, and what simplifications are being made, and how they're being performed, I'm screwed if I want to do it any differently.

      Have you watched Serial Experiments Lain? It is a Japanese animation about a little girl that is slowly sucked into the world of the Internet. In her school, they learn programming and she has a textbook that describes the architecture of her computer. In effect, this is what all who want complete self-determination need: a textbook that tells us how the tools we use to process data do it.

      This makes non-self-determination an attractive option. Most people will simply choose to take what they are given and to hell with how it's processed or from what source. They will eventually end up looking at a data-feed and occasionally clicking on interesting bits of information. This may be a satisfying way of life.

      Then there will be the Merovingians, holding all the keys. They will understand the workings of the data-feeds and will, through subtle manipulation, be able to tap the vast computing power of the hardware that underlies them. They will also control all the drones.

      The fight is happening right now. The media companies are the Merovingians, and consumers of media are being herded into smaller and smaller squares. Some will squeeze out of the barriers, and form a Zion of resistance, of hard-fought lives on the fringes of the information society. Some will join the Merovingians. But most will enter the Matrix.

      As Roac son of Carc would say, I will not say if this be good or bad. But I will say that I want to be in Zion.

      • The Matrix taught me many important things about Religion and Philosophy. For example, Jesus knows Kung-Fu and shoots people. And... um... spoons aren't real.

        I swear, next time someone says "There is no spoon" I'm going to throw a spoon at them.
  • Inq (Score:2, Informative)

    by heli0 ( 659560 )
    Here is the Inq writeup on this from a few days ago: Man that inspired The Matrix reckons we should all learn assembler [theinquirer.net]
  • still amazed (Score:2, Interesting)

    by malus ( 6786 )
    that the customers I write software for 'work' on their computers 8-10 hours a day, and still have very little 'understanding' of what they are doing.

    These users, which I can only imagine are representative of most computer 'users', don't really care. They simply look at their keyboards, mice, and monitors, and think, "I don't need to understand what I'm doing, all I know is that I need to do this, that, and perhaps another thing, and voila! A paycheck every 2 weeks!"
    • Re:still amazed (Score:3, Redundant)

      by dreamchaser ( 49529 )
      That's exactly how it should be. A computer is just another tool. Do most people understand what goes on inside their microwave? No, they just push the buttons and stuff gets hot.
  • by MongooseCN ( 139203 ) on Saturday November 08, 2003 @10:45PM (#7426852) Homepage
    Instead of typing in stuff like:
    #include <stdio.h>
    int main()
    {
        printf("Hello World.\n");
        return 0;
    }
    Now you simply type in:
    Microsoft_knows_whats_best_for_me;
  • Fantastic Article (Score:3, Insightful)

    by Beg4Mercy ( 32808 ) on Saturday November 08, 2003 @10:51PM (#7426874)
    The author poses the question of "Would you know what to do if you were left staring at lines of letters and numbers of HTML?" to which I (and most of Slashdot, I suspect) answered YES! Then I found out since my answer was yes, that I am in a minority! Awesome!

    I am a Computer Science major at MUN and with the reduced spending and reduced high-tech jobs my greatest fear is that I will not get a job in a couple of years when I finish my degree. I, along with many other SlashDot readers, might become obsolete.

    This article takes the opposite viewpoint -- it emphasizes how important computers and technology are to our future. I particularly liked the article telling us that everyone else is in a dark age. :) This is simply a case of somebody "telling me what I want to hear" and I love it! :)
    • I'm going to have to disagree with the article on this. As a CS major at Harvey Mudd College [hmc.edu] (one of the most technology-oriented colleges in the country), I have already learned not only assembler, but how to build computers starting with transistors and wires. While this is fascinating (I have a whole new respect for anyone who writes an operating system), I don't think it is at all necessary for my future. The entire reason that grammars and parsers (For those of you who aren't really into CS, these are
  • If it is also necessary to spend hundreds - no, thousands - of hours at the keyboard, how are our scientists, the ones that aren't only interested in computers, supposed to get the time to study their own respective fields? I mean, thousands of hours takes up a bit of time, but I don't think we want nuclear physicists or molecular biologists putting their own fields second. I mean, imagine if your surgeon were a little tired from trying to figure out the slashcode he just installed on his home computer?
  • pointless article (Score:3, Insightful)

    by hankaholic ( 32239 ) on Saturday November 08, 2003 @10:55PM (#7426885)
    Basically, the article says:

    GUIs don't map very well to the way that computers actually function.


    Within 50 years, the average secretary will need to know how to program or will be unable to perform his or her job.
    Those aren't direct quotes, but that's the meat of the article.

    Imagine that in 1930, somebody said that the controls presented to drivers don't map well enough to the function of cars, and that in the future people would have to know how every drivetrain component works in order to drive or face losing the ability to use public roads.

    You don't have to know how a VCR functions in order to use it. You don't have to know how your cell phone transmits signals in order to use it. You don't have to be an engineer or know how a torque converter works in order to drive with an automatic transmission.

    I don't see why an article that states that users will have to know how to code in order to use computers is worth a spot on the homepage.

    Am I missing something here?

    • Cars don't handle your personal, financial, and various other types of data. You need to know how to use a computer to a degree to control, maintain, and extract new information from it.

      Cars, VCRs, and telephones only do one thing and that's it. Computers are very much general purpose. In fact years from now you can explain what a telephone operator was to your child -- they'll most likely be using some type of VoIP replacement. Why? Well if you don't know how it works in the least - how do you know
    • by nels_tomlinson ( 106413 ) on Saturday November 08, 2003 @11:27PM (#7427005) Homepage
      Imagine that in 1930, somebody said that the controls presented to drivers don't map well enough to the function of cars, and that in the future people would have to know how every drivetrain component works in order to drive or face losing the ability to use public roads.

      Am I missing something here?

      Maybe you are.

      The car, in 1930 and today, is a simple, single-purpose artifact. However complex it may be under the hood, it goes places and takes you along. The driver needs to steer, and control speed. That's it. To suggest that the driver CANNOT use it without being able to understand, repair and adjust every component is pretty silly.

      The computer is a non-specialized, multipurpose artifact. A programmer can make it into a very expensive word processor, or a very expensive ledger, or a very expensive sliderule, or a very expensive map, or ... To suggest that the operator must be able to provide at least some of the instructions the computer needs, in order to make full use of it to accomplish his job, doesn't seem entirely silly.

      Any job which requires no creativity (for want of a less fuzzy word) can be done by a computer without any human intervention. For example, if you are simply entering data and running programs A, B and C, a better system could enter the data and run the programs without you.

      I would say that any worker using a computer who can do his job without doing ANY programming could be replaced by a slightly better program than the one he is ``operating''. The only exceptions would be people doing jobs which are wholly creative, and could be done without a computer at all (e.g., writers, who could use pencil and paper).

      Furthermore, the complexity in a car is not irreducible. A battery, an electric motor, some wheels ... it would be possible to make a car that the average driver could understand. There is nothing there beyond the moving parts. The car is not valued because of its complexity, but because it gets you places.

      In contrast, the complexity of the computer is irreducible. Even if it were physically simple and comprehensible (and really, it is), the software is arbitrarily complex, to the extent that we have a new field of science to study it. It is this complexity which makes the computer valuable.

      • The computer is a non-specialized, multipurpose artifact. A programmer can make it into a very expensive word processor, or a very expensive ledger, or a very expensive sliderule, or a very expensive map, or ... To suggest that the operator must be able to provide at least some of the instructions the computer needs, in order to make full use of it to accomplish his job, doesn't seem entirely silly.

        I dunno. I think that's perfectly logical. If I want a wordprocessor, I shouldn't have to do more than

      • You seem to assume there are only three types of jobs, those that could be done by a computer, those that involve programming computers to do the first, and those that are "wholly creative, and could be done without a computer at all."

        In fact, only a small number of trained professional jobs could be done by computers. Computers extend the ability of a single person, such that fewer people are needed for the same job, occasionally, but those people are not therefore replaceable entirely.

        Businessmen who make de

        • You seem to assume there are only three types of jobs, those that could be done by a computer, those that involve programming computers to do the first, and those that are "wholly creative, and could be done without a computer at all."

          But many jobs involve the use of computers as an implement, to communicate, to do word processing, to receive information, to make trades, orders, etc, but could not be done by a machine. That is the distinction, that while plenty of jobs need a machine, they could not be do

    • Others have addressed the specific point, but as for your claim about cars:

      that in the future people would have to know how every drivetrain component works in order to drive or face losing the ability to use public roads.

      Pretty much true to keep your car on the road unless you want to either a) pay LOTS of your hard earned money to someone who does, or b) take some time to understand what the basic components are and how they are maintained.

      If you don't know that your drive train needs to be inspe

    • Am I missing something here?

      Yes.

      A car can have a consistent user interface because the functions that it performs are specific and limited. The same goes for cell phones and VCRs. That's also why cell phones, VCRs, and microwaves turn on instantly instead of having to boot up; their functionality is usually fixed.

      Computers, on the other hand, are multi-purpose machines that people want to use to process information in lots of different ways. How you want to process your information will determi

    • I think you are missing the point that many intrinsic parts of programming are hard to learn. Some people will never get recursion. Some types of tasks cannot be done reasonably without recursion. So, I claim, some people will never be able to do some kinds of programming.

      Another problem is that English and other natural languages are terribly vague.
      Human: "Computer, add the first 50 numbers".
      Computer: "Should I start with 0?"
      Human: "Of course not, 0 is not a number!"
      Computer: "Hokay, if you sa

  • by windows ( 452268 ) on Saturday November 08, 2003 @10:56PM (#7426888)
    The article suggests that machines deal in text rather than in colorful GUI windows. This isn't true at all. The computer has no preference between user interfaces; it doesn't make a bit of difference to the machine.

    Whether I'm adding a switch on the command line or checking a box in some GUI, I'm performing the exact same function - that is, toggling some flag/setting within the program. It's just a different representation. The article suggests that text is the language of computers. This is not true at all. The language of computers is a stream of octets that are interpreted as instructions by the processor. That is the only language the computer actually understands.

    I can say for sure that I find the GUI very efficient at times. For example, I do some video editing and converting, and find myself using mencoder (a tool included with mplayer) rather often. There's a LOT of switches at the command line, and often I find myself spending several minutes browsing the manual page to find what switches I need set. And even then, sometimes I find myself turning to Google to find the information I need. I can't help but think that it could be done much more efficiently with a very basic graphical front-end. The CLI isn't always more efficient.

    I know there are many tasks that are better done from the command line. But to say that a user operating a GUI is further removed from the internals of the computer is just incorrect. Whether I'm adding a switch on the command line or checking a box in a GUI, it generally has the same effect.
    • sure GUIs are very efficient at times. just like pictures are efficient for cartoons.

      what you have just pointed out are the only areas of computing that are better served by GUIs. i'm talking CAD, photo manip., CGI (even that is debatable since scripts actually do the bulk of the work) etc. because they deal with pictures.

      don't get me wrong, GUIs do have a place. just like pictures do. but if pictures were really worth a thousand words (enough to replace them altogether) why do we still carry around books
    • So how do you do loops and conditionals through a GUI? Computers exist to do stupid and repetitive things, yet most GUI apps don't give you a flexible way to take advantage of that. You can only do the tasks the programmer intended. To do anything else, you need to be a programmer, and that usually means a text-based programming language.
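The parent's point about loops and conditionals is easy to make concrete with a small shell sketch (the directory and file names here are invented for the demo): rename every log file that contains an error, something most GUI file managers would make you do one file at a time.

```shell
# Set up a throwaway directory with two sample files.
mkdir -p /tmp/cli_demo && cd /tmp/cli_demo
echo "ERROR: disk full" > a.log
echo "all fine" > b.log

# A loop plus a conditional: append ".bak" to every .log file
# whose contents mention ERROR, and leave the rest alone.
for f in *.log; do
    if grep -q "ERROR" "$f"; then
        mv "$f" "$f.bak"
    fi
done

ls    # a.log.bak  b.log
```

The same idea scales unchanged from 2 files to 20,000, which is exactly the "stupid and repetitive" work the parent is talking about.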
  • by GnuHaiku ( 649058 ) on Saturday November 08, 2003 @10:56PM (#7426889)
    The suggestion that people should use CLIs instead of GUIs so that they can understand how their computers work at a fundamental level seems kind of ironic to me. CLIs were originally introduced as just another layer of abstraction. When you type "ls", you don't really think that you're sending the command directly to your CPU, do you? The command shell processes the text that you input, interprets it, and cranks out a result (I oversimplify, of course). Even your file system is just another level of abstraction, as is the C or C++ code that you type in to be compiled. On the other hand, additional abstraction can simplify user tasks tremendously and make learning curves much shallower. Try writing a "hello, world" program in ASM^H^H^H octal, and then in Perl or Python or C or java or whatever, and see how much easier it is!
    • That journalists, even "technical journalists," know jack shit about computers. Let this guy start hammering out all the x86 opcodes one by one then tell me his article makes sense...
    • ...because all they understand are 1 and 0, and pretty much everyone understands the concept of 1 and 0.

      Anything other than 1 and 0 is a pure human construct. So when we're saying that "computers are hard to understand", we're really not saying that computers themselves are hard to understand. What we're really saying is that the humans who design the computer software are hard to understand, and the messed-up way they think is hard to understand.

      A geek is nothing more than someone who has great trouble s
  • by SamNmaX ( 613567 ) on Saturday November 08, 2003 @10:56PM (#7426892)
    The article doesn't really give a reason that users should learn languages. Its only reason is being 'stuck in the Matrix'. Gee, thanks...

    I'm not sure the average user needs to start cracking open books on Java (or even VB). Yeah, as a programmer I take great interest in how the computer works, and it probably makes me more productive. However, I think I was very productive with the computer without actually knowing any programming languages that well.

    The key to being productive, after understanding the basics of the computer in terms of memory, files, etc., is tools. How do you search for text in files (grep, find)? How do you edit HTML files (text or GUI based program)? How do you move files around (samba, ftp, etc.)?

    It may be useful to at least be able to wrap your head around something like a regular expression, though even being able to understand what "*.txt" means is nearly as useful. For the adventurous, a scripting language. I don't think any more than that, at least given the current tools, is necessary. Making a full-fledged program is hard work; it takes time. Most tasks you may think require programming are already implemented.

    Obviously, computers shouldn't be made purely for those who have no patience to learn. However, there is a balance, and everyone knowing assembler, Java, or even HTML isn't it.
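For what it's worth, the glob and grep/find basics the parent lists look like this in practice (throwaway files under /tmp, names made up for the demo):

```shell
# Create a scratch directory with a few sample files.
mkdir -p /tmp/glob_demo && cd /tmp/glob_demo
printf 'hello world\n'  > notes.txt
printf 'nothing here\n' > other.txt
printf 'binary-ish\n'   > data.bin

# "*.txt" is a glob: the shell expands it to every matching
# file name before the command even runs.
ls *.txt              # notes.txt  other.txt

# grep searches file CONTENTS; -l prints just the matching names.
grep -l "hello" *.txt    # prints "notes.txt"

# find matches file NAMES and metadata, recursively.
find . -name '*.txt'
```

Knowing just these three idioms - globs, grep, find - covers a surprising share of everyday "programming" for the average user.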

  • Total Nonsense (Score:5, Interesting)

    by Ignis Flatus ( 689403 ) on Saturday November 08, 2003 @10:56PM (#7426893)
    This type of nonsense always comes from programmer geeks, too. Face it, the computer is a tool, not an end in itself. Sure, it'd be nice if every school child could write perl and understood regular expressions, but why? I'm sure most of you can drive a car, but how many can rebuild an engine? Can you do a brake job? Sure, being a mechanic at the height of the industrial age would have given you a financial advantage over your peers, but in the end, the automobile is just a tool that gets you from point A to B. The same is true of computers: it's just a tool. If I'm, say, a theoretical chemist, why would I need to understand how to get under the hood of my operating system and tinker with it? It's just a tool. I might be interested in some scripting language that my chemistry visualization or analysis programs use, but for the most part, I shouldn't have to tinker with my computer. I should be able to put the key in the ignition (login) and it should work. If it's broken, then I take it to the technician and let her get under the hood.
    • yup, the author wasn't thinking too hard or clearly. How many of us can program in the assembly language of the chip that's in our cell phone, microwave oven or car (I don't even know nor care what CPU is in those!) Even for those of us in front of an Intel or AMD box, how useful is it to most of us to know x86 assembler? Some C/C++/Objective C programmers might find that useful, but for those of us who write Visual Basic or Java or Perl or Python or PL/SQL it's .....utterly useless!
    • Maybe, but perhaps you're underestimating the importance of computers.

      Consider: children spend 12 years of their lives in mandatory education to learn essential, and very complex, skills like writing, reading, mathematics, and analytic thinking. People could in theory get along just fine without knowing these things. But the benefit of teaching these to our children so outweighs the cost that we go to great lengths to do so.

      None of these things are fundamental. They come and go. Once, a detailed grasp of
  • Good Analogy (Score:4, Interesting)

    by LuxFX ( 220822 ) on Saturday November 08, 2003 @10:58PM (#7426899) Homepage Journal
    An interesting concept. And this is exactly why Open Source software should be promoted.

    The spread of human language has been a cumulative process. After the middle ages, when more and more people became literate, there was a corresponding increase in writers. The more writers, the more literature was available, which generated more ideas for more literature. It built on itself. Literature was Open Source: anybody could take existing material and take ideas from it to build more material.

    When we come to a similar stage with computers it will be the same thing. Programming will no longer be for the scholars, and more and more people will begin to take part. And the more software in the collective existence, the more resources there are to build more software. But it needs to be Open Source to facilitate the cumulative process.
    • I think you're dead wrong.

      Just as when cars were first introduced, everyone needed to know how the magneto worked in order to be sure they could get it started. There were few mechanics, so those who dealt with cars frequently had to be adept at rebuilding components and such.

      Can you say that now, 100 years later, that every driver on the road is MORE adept than those early pioneers?

      When the functionality becomes transparent, people don't even KNOW they are using a PC. Do you know how many computers yo
      • Why the obsession with comparing cars to computers? There seem to be a lot of responses on this article along the same lines.

        A car only has two or three main purposes. With cars, the most advanced/successful user is either the safer or faster driver (depending on road or track). The simpler and/or more efficient method of translating the desires of the driver, the higher the success rate of the driver. The desires of one drivers are not that different from the desires of another driver, because the tas
    • just to point something out about my reply. I was addressing the comment "programming will no longer be for the scholars".

      The truth is that there WILL be more capable programmers in the future as the technology increases, but that is simply because the breadth of "scholars" in the area will increase, not because "Joe Secretary" will learn to program in order to understand his email system better and start hacking Mozilla Mail as a result. :-)

      Stewey
    • but for the 'common man' to program, we will have to evolve to something more natural than C or C++. something that doesn't focus on the language, and lets the programmer just fricking program. Maybe in future iterations of Java, or a more natural scripting-like language such as AppleScript (doubtful, but who knows), we will find something that lets people program what they want.

      I look forward to the 'age' (a few years off, given today's teens/20-somethings being so computer savvy) where computer competence is the norm, rathe
      • I think we're already seeing the first stages of that. Word and Excel, for example, come with highly integrated VBScript. There is lots of room for improvement, but the fact remains that millions of people have a scripting language at their fingertips, they only need the necessity to learn it.

        I think that necessity will come when so many people begin using computers for so many personalized tasks that software companies simply can't reasonably create every option for everybody. And again, I think V
  • I don't buy it (Score:2, Interesting)

    Things can only get worse. As our society becomes ever more dependent on information technology, the gulf between those who understand computers and those who don't will get wider and wider. In 50 years, perhaps much less, the ability to read and write code will be as essential for professionals of every stripe as the ability to read and write a human language is today.

    I wonder if he would have said the same thing about cars 75 years ago? As we get further into this new technology, everyone will be dri

  • It doesn't really matter how you display something, whether graphically or textually, but how you represent something. The CLI works because it represents everything as a file upon which commands act. This is why I think that file managers suck. They throw out most of what is good about the CLI, many commands and options to commands, and keep the files. The GUI isn't bad when done correctly or at least consistently, like NeXTSTEP or MacOS, but when you have redundancy and things that you have to change
  • Ridiculous (Score:5, Insightful)

    by dreamchaser ( 49529 ) on Saturday November 08, 2003 @11:05PM (#7426921) Homepage Journal
    As has already been pointed out in other posts, computers do NOT have any preference for text. All he is doing is spelling out HIS preference for text and projecting it as a need for everyone to learn 'the language of the machine'. Does he realize that the language of the machine is just a stream of 1s and 0s? Apparently not. I can't believe anyone published this drivel.

    Making a tool more accessible for the masses is exactly what should be done, and is the normal progression for any technology. Perhaps he thinks that we should program our VCRs by setting DIP switches, or reprogram their code just to catch the latest episode of The Simpsons?

    Yet another example of someone with far too much time on his hands to think coupled with an amazing lack of common sense.
    • Ones and zeros? Does your computer care if you call it a one or zero? Nope--I'm shouting ZERO at my computer quite loudly at the moment but it doesn't seem to understand. Bah, another abstraction. It's just transistors in one state or another. Voltages, little itty bitty electrons (is there any other kind?) zipping around from here to there.

      It's amazing that the damn things work at all.
  • It is pretty interesting how the command-line interface seems to be getting a lot of good press lately.

    I heard that Microsoft was mandating that all major functions in its next-generation operating systems be available from the command line, or something like that.

    Or is this some kind of cyclic phenomenon (the CLI gets praised to the skies, then the euphoria dies down, and the GUI is praised to the skies, then the euphoria dies down, and so on)?

    I suppose things would get pretty interesting.

    Personall

  • by jd ( 1658 )
    GUIs are about laziness and cowardice? These are remarkably tough words from the Computer Guardian section, which has always been a devout worshipper of GUI systems.

    What's changed?

    I don't know - I don't have any mind-reading devices handy - but I'm going to guess they have used, or witnessed someone using, a command-line system utilizing every scrap of the machine's capability.

    With command-lines, you can usually do this. With GUIs, you're often inhibited. Although it should be perfectly possible to G

  • CLI vs. punch cards (Score:3, Informative)

    by schmaltz ( 70977 ) on Saturday November 08, 2003 @11:11PM (#7426953)
    % CowboyNeal | schmaltz >>/dev/slashdot 2>&1
    Some good points are raised, with the example of the command line interface used, which is a much better choice than, say, an array of switches or a punch card.
    Well, the CLI is essentially a direct descendant of the punch card. Back in them olden days, yer Hollerith cards was how you got a stream of data into a system for processing - mighta been a program, mighta been data, or both! The output of your input was generally a fanfold pile of scrim.

    On antique iron you might be running a fortran, cobol or BAL proggie which got compiled and executed, but today thru the CLI your input stream could be a perl -e script, data, commands strung together for serial execution, whatever. The big difference is that now you usually get feedback immediately following your pinky whacking the enter key.

    So, while it's improved quite a bit (especially turn-around time), it's not a huge evolutionary leap from punch card to CLI.
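A minimal sketch of the kind of immediate-feedback input stream described above - data piped straight through a one-liner (awk stands in for perl -e here, purely as an example; the numbers are made up):

```shell
# Three "cards" of data go in on stdin, get sorted, then summed -
# and the answer comes back before your finger leaves the Enter key,
# no fanfold printout required.
printf '3\n1\n2\n' | sort -n | awk '{ total += $1 } END { print "sum: " total }'
# prints "sum: 6"
```

The pipeline is the punch-card deck's spiritual successor: each stage consumes the previous stage's output stream, but the turnaround is milliseconds instead of an overnight batch run.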
  • The most compelling argument that this guy gives is that we all had to learn to read and write in order to adapt ourselves to our society. And it's not easy, either. We spend enormous amounts of time, money, and energy trying to ensure that our children learn these skills, and appropriately so, since they're a critical part of our society. Saying the same is true for computers and that we all have to adapt, somewhat, to how they work is a compelling argument.

    Still, this sounds like the ramblings of some

  • by softspokenrevolution ( 644206 ) on Saturday November 08, 2003 @11:15PM (#7426975) Journal
    So, I'm reading this little article, right, and the man is comparing the ability to read in general with the ability to read and write code, laying claim that in the future everyone must know how to read and write code because the current interfaces we have are illusions, or some other such nonsense, and that the ability to read and write code will be integral to every profession.

    I think regular literacy is just great, in an information based economy and world, you really can't get along without the ability to understand the assemblage of alphabetic characters or pictograms if that is your preference.

    However, it is not necessary to know some really arcane computer programming code to be able to turn on a computer and use an internet browser. That's why we have computer programmers: to give us tools based on these machines that everyone can use. Windows is popular because you just turn it on and look at pictures and words in an easy to understand format (except on very poorly done websites).

    We must adapt to computers? Sure we do; we need to bring our culture into an age where information is available at the touch of a button. What captain Dylan fails to understand is that, though it may seem otherwise, these machines are not gods that enslave us with their cold bluish glow; they're tools. He starts out with a premise and just rants about it. He has no real arguments, no real evidence to back it up besides the fact that he wrote a book. Wow, that makes someone an authoritative expert on everything.

    Let's deconstruct, shall we? Here's my favorite line...
    This is yet another reason why Windows is such a dangerous commodity. It lulls us into the pernicious illusion that we can deal with computers without adapting to their logic. By presenting us with colourful screens and buttons for us to click on, Microsoft encourages us to believe that we can force computers to adapt entirely to our preferences for visual images, without having to adapt ourselves to their preference for text.
    Now, I get just as pissed when Windows crashes on me, or drops my internet protocols so I can't get on the network. But to say that Windows (specifically, mind you - he doesn't talk about Mac OSes, you know, the ones that Windows holds an eerie resemblance to) creates an illusion that we can interact with computers on a level that the average person would appreciate straight out of the box is just unfounded, and I'll tell you why. Because it works. Sure, when everything was punch cards and cryptic text you needed some specialized training to get computers to do what you wanted, but now that computing is well into the mainstream, people are picking it up much like any other bit of technology, like hammers (not the best analogy, but whatever). I feel that Mr. Evans fails to grasp that the reason computers are so endemic now is that they do offer a clear and simple solution.
    In 50 years, perhaps much less, the ability to read and write code will be as essential for professionals of every stripe as the ability to read and write a human language is today. If your children's children can't speak the language of the machines, they will have to get a manual job - if there are any left.
    This part was pretty funny. For someone who wrote a book on evolutionary psychology, he doesn't seem to grasp the concept of gradual specialization within a society. You see, eventually, as a society progresses, you get people whose job it is to think, and as thought progresses you begin to develop professional groups. Eventually these professional groups become specialized themselves, like doctors: sure, way back in the day you went to your barber if you needed a good bleeding or if you wanted a limb off, but today there is a specialty for almost every concern. I look at it like food: not everyone is a farmer, right? Yet we all benefit from the product of a farmer's labor. Now agriculture has been around for, let's say, 10,000 years; most of us would be hard pressed if
  • Tools (Score:4, Interesting)

    by AdamHaun ( 43173 ) on Saturday November 08, 2003 @11:20PM (#7426987) Journal
    The people who make comparisons to microwaves and cars are missing the point. A microwave is designed to do one thing and one thing only. A computer is designed to do, well, anything. The idea that the interface of a general purpose tool should be as simple as that of a more specialized tool is silly. The whole *point* of a computer is its complexity.

    In this context, the article's assertion makes a bit more sense. People who use tools to solve problems need to understand the nature of those tools. Thus, people who use computers to solve new and interesting problems need to understand what a computer actually does before they even begin to work on a solution. Perhaps in fifty years' time we will be using computers for much more interesting things in daily life than we do now. Given the existence of a near-natural language interface and voice recognition, tasks like word processing become trivial. The goal, then, is to be able to instruct the computer what to do in the most efficient way possible. Short of a strong AI, the only way to do that is to understand a bit about how a computer works.

    When you look at things this way, the article doesn't seem quite so extreme. I don't know about "every secretary in the world", but I can see plenty of circumstances where complex instructions would need to be turned into "programming" to get anything out of them.
  • by the uNF cola ( 657200 ) on Saturday November 08, 2003 @11:23PM (#7426998)

    Isn't this too much of a burden for the average computer user? Shouldn't we try to force computers to adapt to us as much as possible by giving them user-friendly interfaces and hiding their internal workings? Shouldn't we be able to get on with our jobs without worrying about what is going inside the black box? If that is your attitude, fine. If you want to remain inside the dream world of The Matrix, that's your choice.


    I've been a programmer for quite a long time. I've dealt with designing interfaces (not too bad at it) and implementing others' designs (some are really great; it's where I've learned). Keeping the insides in and the outsides out is what keeps our lives simple. It's also what makes interface programming such a friggin' pain.

    Let's take a screen that has just a simple checkbox. On.. off.. that's pretty easy.

    Something more complex: a set of radio buttons. If none are on by default, you have to add a check to make sure one actually gets selected.

    Now let's add something like the slashdot post-comment page. Strip all "bad stuff", check that both aren't empty and check against a few rules.
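    A back-of-the-envelope sketch of that kind of check (the tag whitelist and rules here are invented for illustration, not Slashdot's actual code):

    ```python
    import re

    # Hypothetical whitelist: any tag not on it counts as "bad stuff".
    ALLOWED_TAGS = {"b", "i", "em", "strong"}

    def clean_comment(subject, body):
        """Strip disallowed tags and check both fields are non-empty."""
        errors = []
        if not subject.strip():
            errors.append("subject is empty")
        if not body.strip():
            errors.append("body is empty")

        # Drop any tag whose name isn't whitelisted; keep the rest.
        def strip_tag(match):
            name = match.group(1).lower()
            return match.group(0) if name in ALLOWED_TAGS else ""

        body = re.sub(r"</?([a-zA-Z0-9]+)[^>]*>", strip_tag, body)
        return body, errors
    ```

    Even this toy version already needs a decision about where to report the errors, which is exactly the pain being described.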

    How about an international address form? If the city/state is in the US or Canada, you check for zip codes of certain types. (I know these two off the top, ya Brits :P). In the USA? Then you have the state thing, but if it's not, you turn that into a province thing, but only for certain countries.

    Want to include a phone number? Forget it. In the US, the area code can't begin with 1 or 0 and can't be three repeating digits (I believe), the prefix can't start with 555, 1, 0, or a few other things, and no symbols are allowed except possibly -'s in the right places...
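    Those checks might look something like this (a sketch only; the phone rules are the loosely remembered ones above, and the regexes are my own, not any standard library's):

    ```python
    import re

    # US ZIP: five digits, optional "+4". Canada: letter-digit-letter etc.
    US_ZIP = re.compile(r"^\d{5}(-\d{4})?$")
    CA_POSTAL = re.compile(r"^[A-Za-z]\d[A-Za-z] ?\d[A-Za-z]\d$")

    def valid_us_phone(number):
        """Apply the (possibly misremembered) rules described above."""
        digits = re.sub(r"\D", "", number)  # tolerate -, (), spaces
        if len(digits) != 10:
            return False
        area, prefix = digits[:3], digits[3:6]
        if area[0] in "01" or len(set(area)) == 1:  # 0xx, 1xx, or 888-style
            return False
        if prefix[0] in "01" or prefix == "555":
            return False
        return True
    ```

    And that's before you get to country-specific branching for the rest of the world, which is the real pain.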

    Now if this were all done preemptively, warning you "no, you can't do that" along the way, it's one big pain-in-the-ass. Warning you after the fact that you can't continue is also another big PITA.

    But you know what? It's so very necessary. Anyone remember OS/2's SYS1375 error? I hated that frickin' thing. It was the equivalent of a segfault or SIGBUS in OS/2, when a program crashed... something like that. But you know what, those overly verbose messages are great when you are in charge of maintaining or creating a system.

    In the end, I want to be babied from A->B when going through some task or process that has an interface. I like the idea of not needing to consciously think, "I have to create an image of a CD first, THEN I burn it." That's one thing I like about CD burning tools vs. CLIs, where you make an ISO first and then burn it. I wish configuring a kernel to my system were that simple. It'd be nice if it all worked with autodetected modules on first-time startup.

    It's the difference between wasting those internal mental cycles figuring out what's going on and not. If I wanted complex, I'd figure out how to read my phone bill. They like to send it in Spanish, though I told them I want it in English. Morons..
  • This article is rubbish. Anything that can be reduced to arithmetic and boolean functions can be expressed on a computer. The gates on a CPU couldn't care less what they're storing or calculating.
  • This is just silly (Score:5, Insightful)

    by DarthTaco ( 687646 ) on Saturday November 08, 2003 @11:30PM (#7427017)
    This guy is an idiot.

    By presenting us with colourful screens and buttons for us to click on, Microsoft encourages us to believe that we can force computers to adapt entirely to our preferences for visual images, without having to adapt ourselves to their preference for text.

    Computers have no preference for text. They have no preference for graphics. If they could be said to have any sort of preference at all, it would be binary. And that would still be a misleading statement.

    His goofy comments about HTML don't make any sense. HTML is just as artificial a construct as the graphics rendered by the browser engine.

    Does this guy think that you can just write some code on a piece of paper and show it to a CPU? The text on your screen is already an abstraction.

    Sounds like he has some problem with the fact that even idiots like himself can use a computer without any kind of in-depth knowledge.

    And all this nonsense about forcing computers to adapt to us. WE MAKE COMPUTERS. They didn't "evolve" of their own volition. I'm surprised this guy isn't complaining about how using a steering wheel doesn't require knowledge of the actual steering mechanism.

  • It has a nice start, but it doesn't give any concrete reasons. WHY are people doomed if they don't want to learn to program?

    I agree entirely though; I was going to forward the article to everyone in the office, but it lacks reasons and explanations. *shrugs*
  • Bad writing (Score:2, Insightful)

    by tyrecius ( 232700 )
    A very important point that the author misses here is the fact that most people are pretty bad at writing. This usually isn't very much of a problem. Even if I'm a bad writer, I can usually get my point across to a co-operative reader.

    But a computer is anything but co-operative. Being a bad programmer in the computer realm is a much more serious handicap than being a bad writer in the literature realm. Computers are much less forgiving of mistakes. And there tends to be a lot more complexity engendered in
  • by KalvinB ( 205500 ) on Sunday November 09, 2003 @12:55AM (#7427301) Homepage
    If I want to make a 3D game, I can use OpenGL or DirectX. However, I have little clue how exactly telling OpenGL to put a quad with various features on the screen actually gets it on the screen with those features. For all I care it's magic. It just works. OpenGL and DirectX are my lowest levels of abstraction for 3D graphics currently. There's also DarkBASIC et al for those who don't want to go even that far down.

    If I want to make a 3D API like OpenGL or DirectX, I need to dig down deeper to understand how graphics cards work in order to get any realistic amount of speed. Consoles tend to have fewer levels of abstraction.

    This is how it works in every area. You have the people on top who have little to no abstraction. They know exactly how every little thing fits together. And they get paid accordingly. The people on the bottom just give you your burger. And are paid accordingly. They don't care how the meat or buns or whatever got there or where the money goes aside from their pay.

    I can imagine that like all things, computers are going to reach a level of complexity that's just flat out absurd. Hobbyists work on kit planes but it takes years of training to properly maintain a Boeing 747. As the complexity of planes went up, the requirements for getting hired to work on them also went up.

    However, as usual, there will be a handful of geniuses who understand everything and write abstraction layer upon abstraction layer until a level is reached where it doesn't take a genius to get a polygon on a screen.

    All that will change is the amount of education you'll need to be able to function at a certain level.

    Planes are one thing but it used to be that to fix a computer you had to hunt for a vacuum tube or whatnot that was out and replace it. These days, if a computer breaks, within 5 minutes you can determine the problem then throw out the defective part and buy a new one with little training. It's actually gotten easier to maintain PCs. I don't have to try to find and then fix a transistor in the northbridge. I just throw the MB out and get a new one.

    So yes, you do need to know assembler for certain positions to earn a certain pay. But, there will always be other entry levels that don't require that level of knowledge. It's up to the individual to choose what level they want to strive for.

    In conclusion, dumb, not well thought out article.

    Ben
  • by SoupIsGood Food ( 1179 ) on Sunday November 09, 2003 @01:02AM (#7427335)
    Bah. Everything in information technology is a metaphor, an illusion to trick the human mind into coping with the machine. Where do you think the term "channel" came from? It's a metaphor, using a nautical term to boil down an overbearingly complex technical description into a concept non-technicians can understand when trying to get their television to show them Gilligan's Island.

    Saying the command line is "closer" to the way a computer "really operates" is preposterous. The command line itself is a metaphor, an abstraction that simulates lingual conversation, where a GUI is an abstraction that simulates tactile space.

    Most programming languages are based around the lingual metaphor... but not all of them. Prograph was a language based around manipulating shapes in a super-flow chart, and Helix is a relational database language based around the same concept, only in a declarative rather than procedural programming context.

    Computers aren't even remotely human... they aren't even remotely alive or self-aware. These are just anthropomorphizations people assign to the system, because they don't understand that the command line, the C++ language, the GUI, are simply anthropomorphic metaphors, conceptual hacks that empower the user.

    The very first Hollerith machine used on Ellis Island was very close to a GUI system. You plugged in the card and turned clearly marked dials to indicate nationality, age, etc., which were punched into a card (stored in memory). Information was read from memory by putting the cards in a reader, where the appropriate option was lit up on a menu of possible options listed in plain English, corresponding to the nationality, age, etc., as stored on the card. It depended on a tactile metaphor to store data and a visual metaphor to retrieve it, rather than an answer-response metaphor like a CLI. The only way to get closer to the metal is to put the bits into memory by hand with a hole punch.

    What's needed are better, newer, more empowering metaphors. GUIs engage the parts of the brain that deal with tactile, pattern, and spatial relationships, so they're a better metaphor than a command line in most instances. We need to transcend the GUI with a more involving illusion, not just swap it for an older illusion that doesn't take as much advantage of human neurology and psychology... like the command line, or job control language, or patch panels.

    SoupIsGood Food
  • I am just a humble caveman. Your modern user interfaces confuse and frighten me.

  • ``Projects promoting programming in "natural language" are intrinsically doomed to fail.''

    How do we tell truths that might hurt? [virginia.edu]

    A true classic.

Every program is a part of some other program, and rarely fits.
