Technology

Scientists And Engineers Say "Computers Suck!" 251

Posted by timothy
from the so-what's-new? dept.
drhpbaldy writes: "At the latest ACM meeting, scientists and engineers threw mud at computer scientists for not contributing anything useful. They lambasted CS types for developing complex and useless technologies. Some of the fault was placed on VCs for funding only the fanciest and stupidest technologies." Of course, when people say that "design" will save the world, they usually mean their idea of design, which might not jibe with yours or mine.
This discussion has been archived. No new comments can be posted.


  • by Anonymous Coward
    When you toss coin, call heads tails, only coin land on inside, do you wither? When you stumble all clundyvoo like the stainless wazzjackets that populate oo voof many souls? Do you hander mascronian, or smell tobcoa all fleshy like? Will it be? Or will it soon? Eek leff von fools, all them be brandybums.

    The Rambler

  • by Anonymous Coward
    Actually, I'd heard that you can take advantage of the DirectInput part of DirectX to capture that key combination... can anyone confirm this?
  • by Anonymous Coward
    What are you talking about? Programming (for me, anyway) is ALL about the satisfaction from building something useful and the artistic delight of design - in programming, you build something from quite literally nothing - you create order from chaos. Programming is speech, but it's much more than that - to be a good programmer, you have to think in abstract ways and be able to think truly dynamically - static thinkers have no place in the art of programming. Anyone who says they are programming for *just* money is NOT an artist. Good code is truly poetry, and good programmers are truly artists.
    What you fail to realize is that engineers write code too. It's just one small tool in our belt of many.

    I could have gone to school for CS... but I figured, why pay so much money for a degree in something I could easily teach myself?

    Instead, I got my Aerospace Engineering degree with an emphasis in Fluid Dynamics and Automatic Control Systems (lots of complex coding in those things, btw).

    Computers are great tools, but I really doubt a CS guy could write the code I do... they simply don't understand the engineering stuff that needs to be calculated. Fluid dynamics is an art: you only learn how to solve the equations after lots of practice learning what can be dropped or substituted in the equations of motion.

  • I almost agree with the premise, with some conditions. Computers aren't doing anything near what they were and are hyped to do. Hal is still a long, long way away. But I don't think that is computer scientists' fault -- I think it is Intel and Microsoft's fault.

    These companies told the world that computers were ready to make their lives better. They made a lot of laughable statements that were, unfortunately, easy and desirable to believe. Then these companies mass-marketed their products and made bundles of money. Imagine vulture, er, venture capitalists in 1910 saying "London to New York in 3 hours via plane!" This is what happened in the computer industry, and there has been a lot of disappointment as a result.

    Consequently, Intel's research budget grew very fast, evidently much faster than they could improve their designs by the look of things. However, the companies that were making real advances in processors have been pushed out of business (next week, we'll discuss whether the "efficiency" of capitalism is really the right economic principle to maximize ;-). The same with operating systems. It's very interesting to see that the only successful competition for Windows is a piece of volunteer-built public infrastructure that grew on a schedule largely independent of "market forces".

    The term Artificial Intelligence (my research, sort of) is horrible, and has probably contributed to the disappointment. I don't think software techniques have matured much. Hardware and hardware processes have become much better -- memory densities, magnetic storage densities, even CRT manufacturing. But I really don't see any improvement in available software. At least with GNU/Linux, there's an attempt to find the right way to do things even if it takes several tries and isn't popular or financially rewarding.

    The best thing that has happened, by my estimation, is the interconnection of computers. Networks have proven far more valuable than so many other technologies like speech recognition and vision. Those technologies are very, very interesting, and it's proper for people to study them. But natural language processing has not had an effect on how we get through each day, yet, despite hype from the market.

    It's interesting, therefore, to see how Microsoft, Intel, etc. hype the Internet. Watch how they try to spin their products to say that they add "value" to the Internet experience. An Intel P-MCMXXXI doesn't make the Internet any better. The important aspects of the 'net don't depend on Flash or Javascript, and certainly don't depend on Flash or Javascript executing on bajillion-megahurts processors. The Internet, the best thing to come to the public from the tech sector (followed by handhelds, I think =-), is useful with or without Intel's latest and greatest. The internet is even better without Microsoft's latest and greatest Media-Hijacker. =-)

    The Internet is valuable for the transmission of information. Computers are valuable for the processing of information in simple ways, really quickly. Neither of these creates new information in any meaningful sense--we still need humans or nature or whatever for that. But none of this sounds exciting enough to sell computers, and as a result Microsoft and Intel, etc., created the hype that has led to a large disappointment. They preached the land of milk and honey but delivered a desert (I better watch out for lightning bolts after saying that...).

    I like to say that these companies, and the whole PC industry, have been "taxing naive consumers." And now consumers are realizing that these companies have squandered their money. It is ironic, and slightly humorous if you've a strong stomach, that the academics are getting blamed.

    -Paul Komarek
  • Computer Science is not just "coding". There is also a lot of real research in the field -- think, for example, of database indexing, operating systems, compilers, and programming languages. All of these have their foundations in proper research. Otherwise you would still be programming in machine code, operating systems would not be able to run several processes, and large amounts of data could not be processed.

    And indeed, for a scientist, the code is not the thing that is important, it's the idea! Imagine a very simple thing, the quicksort algorithm. I can implement it in many different programming languages, but the thing is still the same. (BTW, my personal favorite for this is Haskell, which is really beautiful code:

    quicksort :: [a] -> [a]
    quicksort [] = []
    quicksort (x:xs) = quicksort [y | y =x]

    ).

    Sebastian
  • Sorry, some of the code got removed in the post...

    quicksort :: [a] -> [a]

    quicksort [] = []
    quicksort (x:xs) = quicksort [y | y <- xs, y<x] ++ [x] ++ quicksort [y | y <- xs, y>=x]
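    For comparison, here is a rough translation of the same idea into Python (my sketch, not from the original post): list comprehensions partition the list around the pivot just as the Haskell comprehensions do, which shows how the algorithm survives a change of language intact.

```python
# A Python rendering of the list-comprehension quicksort above.
# Like the Haskell version, it expresses the idea rather than an
# in-place, cache-friendly implementation: it builds new lists
# at every level of recursion.
def quicksort(xs):
    if not xs:                     # quicksort [] = []
        return []
    x, rest = xs[0], xs[1:]        # quicksort (x:xs) = ...
    return (quicksort([y for y in rest if y < x])
            + [x]
            + quicksort([y for y in rest if y >= x]))
```

    For example, `quicksort([3, 1, 2])` returns `[1, 2, 3]`.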
  • Users lose dialog boxes. I don't know how, but they do it.

    One thing I think I'd like to see is the rest of the app graying if a modal dialog box is invoked, making it clear that the dialog is "in charge."
  • Correct.

    To keyboard-reset a VMware session, you must use C-M-Ins, instead of C-M-Del.


    Rev. Dr. Xenophon Fenderson, the Carbon(d)ated, KSC, DEATH, SubGenius, mhm21x16
  • it comes down to licensing -- not technical issues. manufacturers to this day still have to pay the ``apple tax'' for each firewire port on their device, as opposed to USB where no such ridiculous licensing is required.

    MO never took off because the various disk manufacturers could never agree on common formats. ZIP was similarly doomed from the start since only one company manufactures it.

    USB appears to be obsoleting serial and parallel for all practical purposes... it was showing up on PC motherboards before apple came out with the imac. it's becoming difficult to get a PC motherboard without USB. printers, scanners, mice, keyboards, even the beloved cd burners are now available in USB form, knocking out their bus-predecessors.

    I personally think compactflash prices and densities will eventually improve to the point that they will replace the floppy. it just needs to get cheaper.

  • The bare fact is that scientists rarely have to please any end-users, and they never have to please everyone. Programmers and computer scientists are always having to check with customers. Scientists just check with the people that hired them. Those people check with customers, but those people are very rich and can afford to not please everyone.

    People are stupid. People asked for stupid stuff on their software. The software is stupid.

    Poke fun all you want, but since the invention of photo-paper, science has contributed absolutely nothing to the important field of pornography distribution. Look how far we've come.
    -the Pedro Picasso


  • Apple no longer includes internal FireWire connectors in their current hardware.
  • What does this have to do with computer science? The complaints mentioned in the article were really about how the software industry was screwing up. I don't think it's fair to blame computer scientists, since CS research is generally ignored by software developers.

    It might be fair to blame the industry for not making usability a priority, but it's generally a low priority for customers too, and companies prefer not to spend resources on features customers don't care about.
  • Being a computer engineer I've had the opportunity to build physical devices, though these days I program for a living (for some reason people think that if you have the word "computer" in your degree then you know how to program. Go figure. I won't complain - it pays better ;-)

    I'd have to say that writing a well crafted piece of software is extraordinarily satisfying, but it doesn't come close to the satisfaction of having built a physical device. Unfortunately, physical devices are more expensive and time consuming to build. The way I see it, hardware design is the same as programming, only in a different medium. The advantage is the device you can show others when you've finished.

    Of course, probably the most satisfying thing I've ever done was to build a device based on a microprocessor, and to then write all the software that ran on it. :-)

  • It wasn't an aircraft carrier, it was the Aegis cruiser USS Yorktown.
  • Attention:

    All the scientists and engineers who agree with this article *must* stop using these obviously non-useful and pathetically convoluted software tools or risk being kicked in the ... knees for being beef-witted hypocrites.

    Have fun with your pencils and telephones.
  • Users lose dialog boxes. I don't know how, but they do it. I've spent the last two days making all the error pop-ups in my company's software not only modal, but stay-on-top and sticky too. Granted this is a highly specialized system, and theoretically missing one of those errors could put lives in danger, but how the hell do you lose an error box? It's right in front of you dammit.
    As if I didn't have better things to work on...
  • Uh... feeding the troll, sorry.

    From a Rhetoric perspective, which is better?

    A) constitutionally based representative democracy with socialist education system and fascist military, etc.?

    or

    B) USA-Democracy (which implies all that)?

  • of course, before the advent of computers, travel was all but impossible. Just look at Columbus's voyage to the New World. If Spain hadn't heavily invested in GPS software and IT infrastructure, how would they have known where they were going? (Of course, had they solved the traveling salesman problem first, they could have gone to the East in only x ln x moves.)

  • by Anonymous Coward

    From Jonathan Walls

    This is insightful? Does it not strike moderators as pathetic to see a knee-jerk reaction to criticism, laced with bad sarcasm, insults and poor logic, pandering to the tastes of the audience?

    Especially in a community that likes to think of itself as intelligent and cutting-edge - you would have thought a bit more open mindedness could be expected. Anyone with the ability to see another person's point of view would acknowledge that using the Start button to stop, or requiring hardware knowledge to install an OS, and so on, is indicative of a situation that needs improvement. And remember this is criticism of an industry rather than individuals, so there's no point cherry-picking to prove a point.

    As for "computers are complex because your needs are complex", that sounds like a pissing competition, i.e. "My needs are so complex you could never design a simple [set of] interface[s] to solve them. Gee, you must be pretty simple if you think it's possible for you." Then you get, "my complex needs are inconsistent with the needs of others", or in other words, "I am such an individual that no one could ever produce a simple solution that suits me."

    Personally, I want things to be simple. I'm not strutting around claiming to be a complex individual with difficult-to-meet needs. For a start, such a person sounds like an arsehole. But more to the point, I have lots of simple needs. Take the example of doing my taxes - I don't want to, I want a simple process. After all, all the figures I provide are automated somewhere anyway; I don't want to expend any effort at all, I just want a simple solution. Such a solution would undoubtedly have a complex back-end, take a lot of work if it's possible at all currently, and take some talented people to do it right. If I simply saw a one-page printout at the end of the tax year with a breakdown of income and taxes I would be very happy (and rather impressed). Simplicity of interface, sheer usability, takes a lot of talent, skill and creativity.

    If the only example of an intelligent device you can think of is a computerised thermometer, I wouldn't hold much hope of ever getting a good job requiring any of these skills.

  • by Anonymous Coward
    As a software engineer I always strive to make things as complex as possible. :)

    The main problem is the toolkits/frameworks that are used for developing software. Most Unix toolkits really suck! What's even worse is that the language they are designed in, be it C or C++, makes such a mess, because those languages weren't designed for graphical interfaces - they are portable assemblers.

    If the world programmed in Smalltalk life would be much easier. Imagine if everybody had truly reusable classes. Although maybe that would put some programmers out of work. Using a specific language doesn't mean that code reuse will be well done, a lot of it has to do with the programmer.

    Maybe one of you has the idea that will push us past the archaic languages that we currently use.
  • This is just the old SCSI vs IDE argument rehashed.

    SCSI cards handle transfers, IDE makes the CPU do it. Intel wants to drive demand for CPUs, so do they push SCSI as the standard interface mfrs should use, or IDE?

    Same with USB vs. FireWire. (not to mention the NIH syndrome).
  • And you can walk into a store and buy all of these separate appliances. So how can engineers complain that the CS people aren't making them?

    Just because it is programmable doesn't mean a CS type programmed it.

    I also believe that the ACM comments went too far but they do have a point, although I don't think that CS is all to blame.

    So many people have the attitude that technology solves all of our problems. The thing is that technology seems to cause as many problems as it solves, and that blame can be spread everywhere, from consumers to politicians to VCs to designers to engineers to programmers, etc.
  • It's not that common that your programs will leap out of the screen and become a physical object. Even if you think you are creating something out of nothing, once you are done, I can't hold your program. I might be able to see it on the screen or read a printout.

    Programs are abstract. The only reason we use them is that they can be made to work with non-abstract objects. The thing is, most programmers only have access to printers and monitors for output methods, which aren't bad but IMO that can limit the experience and capabilities of the programmer.

    I am something of a programmer and something of an engineer; the benefit I get from that is that _I_ can make physical objects that are a result of my programs, or make electromechanical circuits that actually have real-world interactivity.
  • That would probably be Java then, because its object model is largely SmallTalk's and the syntax, etc. is C-ish. Strictly speaking, there's not going to be one ideal language. FORTH, ADA (95, not 83...), SmallTalk, Java, C, C++, Object Pascal, Oberon, Modula2/Modula3 - each of the above solves problems. Some do a better job of solving a given problem than others. Each of the above can be used for doing much of anything in computer science. But to use Java for an OS, well, there's raftloads in that list that would be better suited. SmallTalk is good for high-level stuff; but to try and make a game in either SmallTalk or Java would be either great or painful depending on the game (Quake III or UT in Java? Nice try, but no cigar...). The problem is people trying to find that magic bullet that will let them do anything with peak efficiency (as it is hard to remember past 2 languages, let alone something like 17 of them...) - there is none. There never really will be one, because there's always some better way of expressing oneself for a given kind of expression - math, for example, doesn't do well for expressing a novel, but it does pretty well at expressing many Physics concepts.
  • Really now... Do you know anything of the dynamics of SmallTalk?

    It's a late binding language, meaning that all operations on methods, etc. are determined at runtime. Better get more muscle to push the app.

    It's, generally speaking, an interpreted language- you're running on a VM, not unlike a JVM in almost all cases of a SmallTalk runtime environment. Better get even more muscle to push that app.

    It's garbage collected, meaning it's going to do evil things to you when you're trying to do something time-critical and it decides to do garbage collection (which you don't have control over - nor does the paradigm of SmallTalk allow for that). Better hope garbage collection can be handled in another thread and you've an SMP machine to use for your app.

    For some things, SmallTalk is great. For things like word processors, etc. it's a blessing.

    For many systems tasks, such as UIs (not app UIs - something more like X on Unix or GDI on Windows) or OSes, it's a poor fit. There are other good fits and bad fits - and if you change SmallTalk to fit the ill-fitting things better, you lose much of the benefits that the language brought to the table, and you might as well have been doing the thing in C/C++/ObjectPascal/ADA95/etc.

    As for truly reusable classes, SmallTalk doesn't make it magically so. It requires skill, even in SmallTalk to do that.
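    The late-binding point above is easy to see in any dynamic language. Here is a small sketch of my own in Python (used purely as an analogy for SmallTalk's message sends, not as SmallTalk itself): the method actually invoked is looked up on the object at the moment of the call, which is exactly the flexibility, and the runtime cost, being described.

```python
# Late binding illustrated in Python, which (like SmallTalk) resolves
# method lookups at runtime rather than at compile time.
class Duck:
    def speak(self):
        return "quack"

class Robot:
    def speak(self):
        return "beep"

def announce(thing):
    # No type declaration: which speak() runs is decided only when
    # the call happens, by looking the method up on the object.
    return thing.speak()

# The same call site dispatches to different code at runtime:
results = [announce(obj) for obj in (Duck(), Robot())]

# Methods can even be replaced while the program runs -- the lookup
# happens again on every call, which is where the extra cycles go.
Duck.speak = lambda self: "QUACK"
```

    After the reassignment, `announce(Duck())` returns "QUACK": nothing about the call site changed, only the runtime lookup result.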
  • Even if vmware runs multiple operating systems at the same time?
  • If it's a shared terminal in a computing lab or somewhere, the secure attention sequence is very useful. Otherwise it would be trivial for the previous user to put up a fake login prompt and capture your password. (The 'schoolboy attack'.)

    I agree that it doesn't solve all the problems, but if you can close one out of three security holes, that's better than closing zero out of three.
  • ...by a good old CS vs. Eng. paintball tourney. Let's get it on!
  • I think it's interesting that Prof. Buxton, one of the most innovative researchers into human interfaces, is one of the people cited in this article. He's responsible for some very interesting work... er, that I can't properly cite because I'm not sure where to cite it from.
    But he's done very good work in making, say, Alias | Wavefront's software be very usable by artists. Technically minded artists, to be sure, but there is a level of intuitive access to the program that just isn't found in a lot of other packages.

    *n
  • You misunderstand my points and bring up a few distractions

    -your motor example
    1) A motor can spin.
    2) A motor can't do anything else.

    -camera
    My camera doesn't have a computer in it. Adding computers to a camera doesn't make it functionally better. The function is to image a scene, with a print as the end product.

    -computer isn't that great for drawing/modelling
    1) Show me the standalone word processor that is better than a PC at drawing.
    2) Show me the standalone computerized drawing pen that is better at word processing than a PC.
    3) Sure the computer isn't the best at every task, but it's damn adequate at trillions of them, and that equals power.

    And another point that I didn't bring up before: The engineers who claim that specializing the computer into single use devices will improve it are trivially wrong.

    A computer can simulate *any* tool that exists. The only difference between the computer and the real world is that 1 to many ratio of form to function. If the engineers claim that a change of form will improve the computer, it is easy to show that the net result is a lot more forms, but no additional functions.

    So what their argument boils down to is simple human factors. They want to make computers easy to use by making them work just like objects we already know.

    So in the end we agree. Why do artists need computers to draw with? They don't. My point is that a computerized pencil won't change that, unless that computerized pencil is integrated into a device with quintillions of possible states: a PC.

  • >Targets of the critics' scorn included convoluted
    > commands such as the common "Alt-Control-Delete"
    > sequence used to close a program or perform an
    > emergency shutdown

    So the engineers are getting all concerned about human factors? I guess I wasn't aware that they had traded in their pocket protectors and slide rules.

  • My camera was built in 1964, so I'm sure. It can do everything that any modern camera can.
  • You're the one trying to make the point. You need to make it to my satisfaction.

    And, learn the definition of an open mind, please.
  • The Secure path in NT is Control-Alt-Delete. There is a very sane reason for this, it's not allowed to be intercepted by ANY application running under NT. Thus, you can ALWAYS know that the OS is in control when you do Control-Alt-Delete.

    You're missing the point. The operating system can trap whatever key sequence it wants - it is the operating system, so all keypresses are processed by it first. Of all the key combinations available on a keyboard, MS chose to use the combination traditionally associated with rebooting the system.

  • There is no real reason your computer should stop playing music, printing, downloading or whatever because the OS is busy with something else.

    While smart peripherals would help, the real cause of the problem is poor software design. There are more than enough CPU cycles to do everything in a timely manner, but the operating system doesn't schedule the CPU correctly. Brain-dead device drivers also contribute to the problem.
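    The scheduling point can be made concrete with a toy sketch (mine, not the commenter's): under preemptive scheduling, a CPU-hungry foreground task does not starve a small background task, because the scheduler keeps interrupting it and handing the CPU around.

```python
import threading
import time

# Toy illustration of preemptive scheduling: a busy foreground
# computation does not starve a background "player" thread. Real
# audio dropouts usually come from bad drivers or priority choices,
# not from a shortage of CPU cycles.
ticks = []
stop = threading.Event()

def player():
    # Stands in for an audio thread that must run a little, regularly.
    while not stop.is_set():
        ticks.append(time.monotonic())
        time.sleep(0.01)

t = threading.Thread(target=player)
t.start()

# A CPU-bound foreground task hogging the processor.
total = sum(i * i for i in range(2_000_000))

stop.set()
t.join()
# Despite the busy loop, the player thread got scheduled and ticked.
```

    If scheduling were purely cooperative and the busy loop never yielded, `ticks` would stay empty until the computation finished; preemption is what keeps the background work alive.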

  • Until they get the 3-D printers working better, the computer is just another tool. And sometimes a lot less useful than a lathe.

    OTOH, the potentials ...

    Caution: Now approaching the (technological) singularity.
  • I'm typing on a PC keyboard on a computer bought less than 12 months ago. Unfortunately, this keyboard has a RESET button. It's located where the PrintScreen button should be. If you accidentally brush against the button when trying to hit Backspace, your computer immediately shuts down! I use Windows 2000 and it doesn't give you a chance to abort. On Day 2, I removed this key. ;-)
  • Programmers might not get the satisfaction of building something useful

    Au contraire, Rodney. Exactly the reason I left engineering is that no-one in their right mind was going to give me two million quid to make a fast ferry because some hung-over graduate thought it would have fantastic seakeeping. Computing, OTOH, if I think it could be good, I'll sit down and code it. Man, this is way creative.

    Dave

    DISCLAIMER: Sometimes you are going to have to make software to an engineering quality.
  • Take a second, look at almost every piece of electronic equipment that you own... most of it sucks ... design wise... not just including computer stuff...

    Why do I have to turn 5 knobs and push 4 buttons to make my home theater receiver/tv switch from dss to ps2?

    Why do 90% of VCR functions only come on the remote, especially important stuff, like switch to the aux video source?

    why does every piece of software come with crappy default settings?

    why are we stuck with crappy interoperability between anything? DV vs. D8mm, firewire vs. whatever, ide vs. scsi .. you name it ...

    i have a pda, cell phone, pager, email, etc. .. but for some reason, getting them all to work together in a synchronous, efficient manner is impossible, unless you of course get all your services from the same company, and who wants that?

    I know I'm generalizing, but these 'engineers and scientists' are the same jerks who've been pushing shitty technology down our throats ... if all engineers and scientists contributed to what makes sense and pushed good technology instead of market-speak, we'd have had beta vcr's, be running linux, and stupid shit like DMCA/UCITA wouldn't exist...

    my 2 cents ....
  • I prefer a standalone DVD player to a PC. I prefer to use a Palm for storing addresses. PCs, even notebooks, don't carry around very well. I'd prefer to carry a mini MP3 player around than to carry a PC around. I prefer a PlayStation for many games over a PC.

    You mention a total of 3 products here - and they all have various reasons why they're superior. The DVD player is better on a TV normally because the screen is bigger. I would much prefer to watch a DVD on a monitor of the same size vs. a television of the same size, simply because of better resolution. So DVD players lose on this point in my opinion. And it's damn easy to watch a DVD on a computer - stick it in the drive and w/ AutoPlay it happens. Then it's easier to control the DVD itself with a mouse than with a stupid remote control with a bad interface.

    The other 2 devices are an issue of portability - and they'd be more powerful if only they could get more power into that same space, which will happen in time. Why carry around both a Palm and a mini MP3 player when you can carry around one device which is more robust?

    Certainly you're not going to use your computer for a microwave, that's just ridiculous. But with a central computer that's powerful enough not to need upgrading every year and a half, you have a lot more time to invest in add-ons. You get that DVD player, you get that huge monitor, you get those nice controllers to play games with that are just like your Playstation. And it all ends up cheaper.

    Imagine - instead of buying a DVD player, a television, a device to surf the web, a CD player, a tuner (you'd still need an amplifier of some sort), and a game console you buy a computer. There's several devices all rolled into one. Who wants all that crap laying around their house when they can have a central computer which powers all of this? And why can't this same central computer power multiple monitors, etc... It's a great deal.

    That doesn't get rid of the need for a portable computer, and your portable computer could even hook up to your central computer, but why carry around both a Palm and an MP3 player? Who the hell wants to do that? Why don't I throw a CD player and a tape walkman into my backpack just for good measure?

    I think people often confuse the idea of the desktop computer going away and computers becoming integrated into our lives. Of course computers are going to become integrated more in our lives. That takes time though, before it becomes really useful we need omnipresent wireless access with omnipresent IPV6 so everyone's toaster can be on line (http://toaster.myhome.com).

    But all together it's really annoying to hear scientists bitching about this stuff. Everyone's just under this delusion of internet-time and they think that the infrastructure of the world will change at that same rate. Infrastructure does not change over night.
  • There's a more pernicious problem: biotech needs a combination of boring, run of the mill CS that does a difficult job reliably and efficiently, as well as novel techniques and algorithms. The trouble is the first type of thing does not appeal to CS researchers, and the second doesn't appeal to software developers (often). Generally, academic research is more interesting than building guis, WebApps and sticking together relational databases, so academics are prepared to get paid less. Funding bodies are reluctant to admit this, so they won't pay the money required to get the software developers everybody needs to put together reliable platforms, and the CS researchers don't want to spend their time doing it (and probably don't have the practical experience either).

    The end result is lots of apps that are interesting from a CS point of view but completely useless to the people that paid for them. Or alternatively, dreadful from a CS perspective, but actually useful to the biologists that wrote them.

    This is, no doubt, a general problem in the experimental sciences, which increasingly rely on information technology for data analysis and programming.

  • Perhaps my bile was uncalled for, but I'm sick of people implying "good design is easy, why doesn't someone just do it?"

    People are always full of good advice that's harder to follow than they think.

    Making good software requires an intimate knowledge of the user that is often practically unobtainable until you have a nearly finished product. When you have a geekish user targeted, they can probably do a good job of describing their needs and reacting to generalized descriptions of UI approaches. Most normal people struggle to describe their needs and mentally "help you" too much by ascribing to the proposed software capabilities that bridge the gap between what you are describing and what they need. These people can only contribute well when you have a nearly workable user interface that they can actually work on and which you can observe.

    Only when you have a pretty functional product do you get the user feedback you need to throw out your bad assumptions, rip out your bad code and start from scratch.

    This is why RAD and rapid prototyping tools like VB, PowerBuilder and Delphi are useful. In my experience there's lots to hate about these things, but they do allow you to do a lot of experimentation with UI.

    I'm working now on a vertical market application that everyone agrees is very powerful, but most people agree is in many places hard to use. I am gradually improving things as I get to know the users better, but it is hard work and very risky -- what one person likes may be hated by another.

    The terms in which we sell software are a problem for us too -- push a button and whee! The world is at your feet. Improving user interfaces requires a considerable taste for crow.

    As software designers, we use a lot of stock approaches to things, and our tools have support for these stock approaches. The problem is that they are often a poor fit for tasks as the users understand them. For example, most RAD application tools have pretty good support for "master-detail" type screens. These were designed to handle typical header/line-item forms like invoice/line-item breakdowns. It's tempting to use them for all kinds of one-to-many relationships -- except that unless you are talking about accounting, it's an unnatural fit for most tasks.

    One interesting area I've been working on is PDA clients. It's been interesting because these stock approaches don't work on the PDA's limited screen real estate. This means that you absolutely have to go back to the drawing board and throw out the stock desktop approaches. In many cases the result is a more usable client. It's definitely inspired me to take a more clean sheet look at the knottier UI problems I have.

  • From the article:

    The essence of the speakers' complaints was that computer engineers have spent the last five decades designing computers around the newest technology--not for the people who use the machines. ... The vast majority of computers have few interactive features and are largely unable to forecast human behavior, Buxton said, rendering them less advanced than airport toilets that flush automatically when the user departs the stall.

    I think these folks are jumping into a middle of a huge cultural debate they don't understand.

    That is the computer as tool vs. the computer as agent debate.

    From your post, I'd place you in the computer-as-tool camp -- with the proviso that it is a novel and infinitely flexible kind of tool.

    The people cited in the article are naively jumping on the computer-as-agent bandwagon.

    I think a computer that understood what I wanted and did it for me would be a wonderful thing (if it didn't put me out of a job) provided we could build such a thing. But I think the longing for this has been created by the general abandonment of psychological and ergonomic principles as a guiding force to UI development in favor of stylistically driven designs (e.g. the put-it-in-a-toolbar movement of the early 90s and make-it-look-like-a-web-page movement of the late 90s).

    You wrote:

    The engineers are being engineers. Who can blame them? They like single purpose tools. Heck, we like single purpose tools too, and that's why we generally embrace the UNIX philosophy of making a program do one thing, and do it well.

    Having built a number of pathologically flexible interfaces myself, I can say with some authority that normal users want tools that do one thing well too. When a user wants to twaddle the flim-flam, he wants to click on the flim-flam and get a pop-up menu that does twaddle (an object centric design); or he'll live with a menu choice called "twaddle" that allows him to select the flim-flam as the target (a functional design). What he doesn't want is a tool that allows him to construct a search template that will match the flim-flam and compose a series of operations to accomplish twaddling.

    In other words, the user doesn't want to think about the tool you build, he wants to use it to accomplish his ends with the minimum of superfluous thought.

    What these guys are really craving is not intelligent tools, but intelligently designed tools. Their thinking on this issue is just fuzzy because they're coming in late:

    "If Rip Van Winkle went to sleep in 1982 and woke up today, he'd be able to drive our modern computers with no problem because they're essentially unchanged,"

    Since when can usability be argued as a sign of unusability? If Rip Van Winkle woke up from the 19th century, I could show him how to use my favorite power tool -- the Sawzall -- in about ten seconds. That is because its design is perfect for what it is supposed to do.

  • What if the system can't bring up a dialog?

    Isn't that what ctrl-alt-delete is for?

    Usually, in that case, the mouse will still work in Windows or X. In Linux I hit ctrl-alt-esc and my pointer turns into a jolly-roger. I then click on the misbehaving window. If your mouse won't move, you can either hit that reset switch (I hope your FS does journalling) or, in Linux, hit alt-sysrq-s, alt-sysrq-u, then alt-sysrq-b. That is, in order, sync all FS, unmount all FS (actually remount RO), and boot.

    Either way, modal dialogs will not work in many cases and you'll have to go to lower levels to recover somewhat cleanly.

    If there was an LCD and a couple of buttons on the front panel, however, I would fully support a confirmation.
  • Going to www.m-w.com we get

    jibe - see gibe

    gibe - intransitive senses : to utter taunting words. transitive senses : to deride or tease with taunting words

    Which is where the confusion comes from. In the sentence "I jibe him and you jibe with me while the yacht jibes" each jibe has a different meaning (taunt, agree, and change course respectively).

  • Um, not to burst anyone's bubble here, but most of my graduating EE/CompE class of 2000 is employed directly or indirectly as programmers. What they program typically isn't Windows software, but more often than not embedded systems, control systems, etc. There _are_ software systems where instability is not an option, period - are you being fatalistic when you say that bugs are par for the course?

    Engineering was a great choice as the basis of a primarily software-based career; getting to build a computer from being tossed some RAM, a CPU, a latch and some miscellaneous components was great experience, and it helps when you actually write software for a machine you didn't build (which in 99 out of 100 cases is what actually goes on). It also leaves open the pure hardware side of the world, in case the software industry blows up (which might happen, who knows).

    Engineer and Programmer are not mutually exclusive. This is also being posted from Canada, where an MCSE isn't enough to call yourself an engineer, either. :)

  • Also, which one of your devices (aside from the playstation) would be worth the plastic it was made out of without a PC it could dock/communicate/exchange-data with?


    Nah, all we really want is a standalone, networked hard drive that any of our separate devices can connect to/disconnect from while running.

  • Um, and exactly what is it that makes an AGP graphics card similar to e.g. a WinModem? It's not as if any major part of the "intelligence" required to drive my GeForce256 AGP graphics board is done by my host CPU, right? In fact, it's almost the other way around, considering that the card has on-board hardware for stuff like transform and lighting. I think AGP should be taken off that list.
  • Thanks for clearing that up. I got confused by my "responder's" confusion over AGP and USB... However, if I may be just a bit too picky, I think that AGP really is a port and not a bus. There can only be two devices ever using an AGP connection: the host CPU (through the chipset) in one end, and the graphics adapter in the other. You can't add another device to it. The 'P' in AGP is for "port". Still, in everyday use it "feels" quite a lot like PCI, so...

    Also, I do think that "original" plain vanilla PCI operates at 32 bits, and 33 MHz for a total bandwidth of ~132 MB/s. There are versions (I don't know their exact names) that use 64 bits and/or 66 MHz, though.
  • I'm neither a computer scientist nor an engineer, but I must at least take issue with part of the complaint. The criticism that "Wall Street rewards needless complexity and shuns those who build the most simple, human-centric devices" seems simply ill-informed. The bulk of VC money in the late nineties didn't get tied up in companies developing "needlessly complex" technologies. Consider:

    1) theman.com received $20,000,000. Rather than suffering from needless complexity, it suffered from needless simplicity. (A website that advised Gen-X age men on good gift ideas for moms, or free pickup lines for cheap chicks?)

    2) boo.com received $130,000,000. Their website suffered from needless complexity, but one could hardly say it was the fault of computer scientists (unless you consider flash animators and guys who sorta know javascript as computer scientists).

    3) DEN received $60,000,000. They made 6-minute-long short films targeting demographics supposedly ignored by television programming (Latino gangland culture, teenage Christian Dungeons & Dragons players, drunken fraternity morons, etc.). Needless stupidity, to be sure. Anything but complex.

    4) Eve.com wasted $29,000,000 of VC money to build an ecommerce site for cosmetics and other ephemera for females. (The pitch to the VC, Bill Gross of Idealab, took 90 minutes, and didn't involve any computer scientists)

    5) iCast.com cast $90,000,000 at streaming video. They're dead, too.

    The list goes on and on. That's over a quarter of a billion dollars above, thrown at companies founded not by computer scientists but by:

    A poet & an ex-model, a couple of ex-Hollywood honchos, previously unemployed MBAs and other non-computer-scientist types.

    FWIWIA.
  • Not really.

    1) If you're using a public terminal or something similar, the people who provide it can probably just record the keystrokes at the keyboard level.
    ** Third-party hardware cannot be made secure by the addition of code to one component **

    2) If this is your private system which has become compromised, secure login info is the least of your worries.
    ** Local machines are not made secure by the addition of code to this one area **

    This is a really old, mostly useless standard left over from the Rainbow Book series. It looks good on paper, but won't get you far in the real world.
    --
  • As a CS/scientist in biotech, I am well familiar with these issues. The problem the scientists are really hitting on is a perceived lack of communication between the CS field and the rest of science. Don't get me wrong: plenty of CS people do communicate with their collaborators, and not all CS people need collaborators, but some do and don't have any. Doing CS in a vacuum when you are developing tools for others to use is really frustrating for those of us who need the tools but find they don't quite do what we need.

    -Moondog

  • See, there are these things called modal dialogs that prevent the user from taking any further action until confirmation is received. If there is an action that could be incredibly destructive to the user's data (like shutting down), you pop one of these suckers up, and the user has to either confirm or deny that they made the decision before the dialog will close. When you employ such user interface design conventions, you can do things like put a power-up/power-down key on the keyboard. The user hits the power key on the keyboard to start the computer, and when they want to shut down, they hit the power button again and click the shutdown button in the dialog to confirm. It's just that bloody simple.
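    That confirm-or-deny flow is easy to sketch; here is a minimal model in Python (purely for illustration -- the function names and responses are hypothetical, not any real keyboard API):

```python
def confirm_shutdown(user_response: str) -> bool:
    """Model of a modal dialog: nothing proceeds until the user answers,
    and anything other than an explicit yes counts as a denial."""
    return user_response.strip().lower() in ("y", "yes", "shut down")

def on_power_key(machine_is_on: bool, user_response: str) -> str:
    # The power key starts the machine when it's off; when it's on,
    # it only powers down after the modal dialog is confirmed.
    if not machine_is_on:
        return "power on"
    return "power off" if confirm_shutdown(user_response) else "stay on"
```

    An accidental bump of the key while the machine is running just leaves it on, which is the whole point of the modal confirmation.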

  • Actually, this makes a lot of sense. A God of creation who is also a God of destruction. Apollo was the God of sickness and healing.
    In the M$ world, c-a-d is the most powerful incantation, only to be used at times of great stress. Compare init(8). Admittedly, init is too great a God to involve himself in starting a user's session.
  • Warning of goat sex link!
    "If Rip Van Winkle went to sleep in 1982 and woke up today, he'd be able to drive our modern computers with no problem because they're essentially unchanged"

    It's easy to criticize modern computers, as their user interface is not modern. Designing a legacy human interface was a calculated decision, however. People are accustomed to the window interface (as in the object, not the MS software), and when things change, people get scared. When people get scared, money stops flowing.

    From a human interface standpoint, computers might as well be aliens from another planet. We taught them to speak to us with windows about 20 years ago (don't nitpick time with me :) and now that is the de facto standard. Computers that don't "speak that language" are considered toys in the public eye (see PS2, furbies, games on cell phones).

    The essence of the speakers' complaints was that computer engineers have spent the last five decades designing computers around the newest technology--not for the people who use the machines.

    I don't think it is appropriate for them to suggest computer interfaces have become obsolete because no one was paying attention, or because no one cared to advance the interface. On the contrary, there is a great deal of research on the subject; any computer science student has probably taken a human interface course, or pieces thereof (I did).

    I think another big problem is that it's posh to be one of the "tech elite" in the business world. Someone who can handle their computer is generally considered more skillful, and seems to have more potential than one who can't. Logically this is because they are able to learn new things, and have no difficulties with abstraction. That is important in business, and in life.

    Anyone agree?

  • I am now a 16 year old kid.

    ...and it's '16-year', not '16 year'.

    --
  • For example, he said, computer users must know that to turn off the computer they have to click on "Start"--not an intuitive step to end a computing session.

    You know, when I want my computer to shut down, I just type "shutdown."

    Maybe I want to reboot the computer... I type "reboot".

    I don't think most scientists are wrong for flaming the computer industry, but there is innovation out there... they're just looking in the wrong places ;-)


    FluX
    After 16 years, MTV has finally completed its devolution into the shiny things network
  • So does that mean they're giving Microsoft money?

    ---

  • So the engineers are getting all concerned about human factors? I guess I wasn't aware that they had traded in their pocket protectors and slide rules.

    I guess the point is that while it may take the intelligence of a rocket scientist to run some systems, the rocket scientists would rather be working on rocket science, not computer science.

    ;-)

  • The links for the original and related stories are here [cnet.com]. The original story in the news report is here [cnet.com], and is much longer than the Yahoo article.

    To a large degree, a certain company is the target, even though it is not named. For example, there is this bit:

    Targets of the critics' scorn included convoluted commands such as the common "Alt-Control-Delete" sequence used to close a program or perform an emergency shutdown. They also lambasted computer designers who refuse to distribute the machines' intelligence to smaller devices scattered throughout the home, instead insisting on packing a single box with maximum functionality.

    Strangely, this sounds rather familiar. Certain large companies will not be named. They do not have to be. The marketroids have strangled the future.

  • You know, there's a reason it's such a 'convoluted' command: it keeps people from accidentally executing it!

    I think that's hardly the point.

    The point is that Ctrl-Alt-Delete is totally nonsensical from the general user's perspective. Why on earth should that mean "reset?"

    My choice of solution would be a reset button that you have to hit two, or maybe three times, in close succession.

    You wouldn't even need to document it; I guarantee you that, when a single push doesn't work, every single user will respond by hitting it repeatedly, and before long, they'll realize that you need to hit it more than once.
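    The double-press idea above boils down to a debounce check; a quick sketch in Python (the window length is an assumption, not anything from the parent post):

```python
RESET_WINDOW_SECONDS = 1.5  # hypothetical value; tune to taste

def should_reset(press_times):
    """Return True only when the last two presses of the reset button
    landed within RESET_WINDOW_SECONDS of each other."""
    if len(press_times) < 2:
        return False
    return press_times[-1] - press_times[-2] <= RESET_WINDOW_SECONDS
```

    A lone (possibly accidental) push does nothing; two quick pushes in succession trigger the reset -- exactly the behavior users would discover on their own.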

    --

  • Perhaps you should check out the new technology from Apple. The iPole [bbspot.com] and its "companion" product, iHole, should be sure to "satisfy" the needs of computer users.


    "Everything that can be invented has been invented."

  • heh. Yeah, I was on track for a PhD. Unfortunately, I quit when I was in 7th grade and I am now a 16 year old kid. That's just kidding; what I said in my first post is true.

    ...and DON'T FORGET TO CAPITALIZE YOUR I, MISTER!!!! ...and it's '16-year', not '16 year'.

  • Since I am the grammar nazi:

    jibe: To be in accord; agree: Your figures jibe with mine.

    jive:
    1. Music. Jazz or swing music. The jargon of jazz musicians and enthusiasts.
    2. Slang. Deceptive, nonsensical, or glib talk: "the sexist, locker-room jive of men boasting and bonding" (Trip Gabriel).

    I'll let you decide which version our friend timothy meant.

    From our friends at dictionary.com [goatse.cx].

  • Small businesses and individuals are going to have to get used to hiring professional support to maintain their computers.

    Well put! Why do people go to garages to maintain their cars, but never hire someone to maintain their computer?
    Laugh at me, but I've helped tons of people by administering their machines... for the small price of four six-packs per consultation. I would have done it for free, but most people want to give you something for your trouble.

    Actually, where I live I have seen computer-troubleshooting companies for cases where individuals have problems with their computers. (Actually, I think it is too late by then... a properly maintained computer has no problems.) I don't know what they are worth, but it seems to be working out quite well, since they are still in business. Not everyone knows a nerd to help them, it seems.

    Besides, I think that using a computer is like driving a car... you need minimal training to know what you are doing. (Don't flame me about safety: I know that badly using a computer can't hurt people, but badly using a car can...) Training complete newbies is very hard: I started to teach my mother, who had never touched a computer in 50 years, to email with Eudora. Working with the mouse and keyboard together gives her great trouble. I can imagine she is not the only one out there.

  • "One single finite computer cannot do an infinite amount of things"

    Strictly speaking, that is also incorrect - said computer can achieve said task if it is allowed to run unhindered for an infinite time :)

    --

  • How can they even say that? Think about all the amazing innovations computer scientists have come up with! One click shopping! Great Porn sites! PORN SEARCH ENGINES! COMPUTER PORN! PORN!
    Funny... I always thought Porn was invented by lawyers, since they seem so focused on screwing everybody else, and recording it for posterity.

    But hey, at least the Internet was invented by a non-computer scientist, Al Gore!
  • Well, voice-operated computers sounds like a neat idea, until you stop to consider the consequences:

    - I don't know about you, but I can usually type faster than I can talk.
    - Imagine yourself speaking to your computer for 8 hours straight, 5 days a week. Heck, I doubt even professional speakers do that sort of thing.
    - A room full of people typing and clicking away is slightly noisy. A room full of people talking to their computers would be quite stressful.

    So, all in all, I'm OK with using a keyboard and mouse to work on the computer. Now, what I'd really like to see would be a functioning Holodeck. Playing VR-Quake would be sooo cool!
  • When trying to install the beast, the provided drivers claimed a Windows file was missing (original Win 95 upgrade). The missing file was the USB driver! The USB driver would not install on a non-OSR2 version of Win 95. It is very easy to see how I got misled into believing the video card would not work without the "USB support". I guess the USB support is unrelated to the video card driver, but what is missing just happens to tag along with the USB support driver. If anybody has the real information, let me know! Any MS driver info seems a little vague.
  • I may have been off base on the video card, but I got burnt on an AGP video card. I updated my old box and put the old Win 95 upgrade on it. It does not support USB; therefore the new card with new Windows drivers could not display anything above 640x480 at 16 colors. That sucked big time on a fancy 8 MB upgrade video card. It said "compatible with Windows 95" on the box. It just failed to mention it was not compatible with the non-OSR2 retail version of Win 95.

    I did not spend another $100 upgrading the OS to support it. I also did not pirate a copy of OEM Win 95 or 98 to support it. I jerked it out and went to a PCI card instead, and just blamed it on the OS and the Windows hardware. I learned the hard way that the AGP port is a USB device and that the original Win 95 does not support USB, even with all the service packs installed. By the way, Linux supports it ;-). I later upgraded the OS to Linux.

  • Brain-dead device drivers are only part of the problem. Devices without a controller and memory buffer are the other. Thank goodness hard disks took a step in the right direction on that one. IDE drives have integrated drive electronics AND lots of buffer memory. The CPU can just shove stuff at the drive and go back to something important, like rendering the next scene in Quake without skipping frames. The less time the CPU spends on being a modem and a printer, the more time it has to render smooth video and service mouse interrupts. Now if there were only a way to buffer the sound card and CD drive so CD reads don't break up the sound and drop video frames.
  • I think they have a point in the MS area. Notice how all the hardware becomes Win hardware with less smarts of its own? Examples I can think of are AGP video cards, Win printers, WinModems, and Win sound cards. There is no real reason your computer should stop playing music, printing, downloading or whatever because the OS is busy with something else. Put the smarts back into the devices so they can again buffer data and function on their own. A WinModem is a waste of CPU cycles, even if it can voice-answer the phone. My new computer can't even play the startup wave file properly. It stutters 3 or 4 times because the CPU is busy with disk I/O. The cheaper-the-better mentality has hurt the quality of design.
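    The buffering being asked for here is just a producer/consumer queue; a toy sketch (Python for illustration only -- no real hardware interface is modeled):

```python
from collections import deque

class BufferedDevice:
    """Toy model of a device with its own controller and memory buffer:
    the CPU shoves data in cheaply and moves on, and the device drains
    the buffer on its own time, so a busy CPU doesn't stall playback."""

    def __init__(self):
        self.buffer = deque()
        self.processed = []

    def write(self, chunk):
        self.buffer.append(chunk)  # cheap for the CPU: just enqueue

    def tick(self):
        # Device-side work: drain one chunk per tick, independently.
        if self.buffer:
            self.processed.append(self.buffer.popleft())

dev = BufferedDevice()
for chunk in (b"do", b"re", b"mi"):
    dev.write(chunk)   # the CPU returns immediately after each write
while dev.buffer:
    dev.tick()         # the device catches up on its own schedule
```

    A WinModem is the degenerate case where `tick()` runs on the host CPU itself, which is exactly why playback stutters when the CPU is busy with disk I/O.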
  • I think most will remember a certain quote that came out of Redmond regarding the fact that no matter how fast Company X makes processor A, Software B will be able to slow it down.

    Now, I'm not accusing anyone. I'm not saying all software developers are out to screw over the hardware people, but look...

    Those who write the software are the last stage. Regardless of how well the engineers designed the hardware, the CS people can either make or break their designs with good or bad code respectively. CS people essentially have engineers at their whim.

    So yes, I certainly agree they're jealous... but in more than one way. They're jealous because CS people, in a way, have more power over the flow of technology.

  • I guess you have great users who always read the dialog boxes, eh? Bet they never open ILOVEYOU.jpg.vbs either.
  • Bob Pease of National Semiconductor has written several articles poking holes in the Taguchi Method. One example was a voltage regulator designed by the Taguchi Method: it was very insensitive to part tolerances; however, it didn't regulate worth a damn.
  • by Panaflex (13191) <convivialdingoNO@SPAMyahoo.com> on Wednesday March 14, 2001 @02:18AM (#365253)
    I hate to burst your bubble... but AGP has nothing to do with USB. The problem with Win95 rev 1 is that it simply doesn't support either one (or much more than minimal, broken AGP, IIRC).

    AGP, PCI, USB, IEEE1394, ISA and EISA are all buses.

    AGP is a design extension of the PCI bus which allows for convenient memory mapping (allowing host memory to be used for video memory, pooling and locking), different clocking, and different DMA strategies. Think of it as an extended PCI specification.

    PCI was a complete redesign of EISA, with particular attention to bus speed and wider bus transfers. Best of all was autoconfiguration of IRQ, DMA, and port mapping. PCI operates at 66MHz.

    USB = Universal Serial Bus. It is a chained 4-wire serial bus that has much more in common with Ethernet than with AGP. It's basically a transmit/receive bus. IEEE1394 is very similar.

    EISA and ISA are old standard buses which oftentimes required hardwired IRQ, DMA, and I/O ports (because of their inability to autoselect empty slots and their lack of a decent bus controller). These were typically 8-, 16- and (EISA) 32-bit buses. And they were way slow, operating at 4 MHz or so.

    So there you have it.
    Pan
  • by jfunk (33224) <jfunk@roadrunner.nf.net> on Tuesday March 13, 2001 @11:16PM (#365254) Homepage
    And then when it comes down to it, nobody wants to buy a specialized piece of a computer when they can get their generalized computer to work.


    I prefer a standalone DVD player to a PC. I prefer to use a Palm for storing addresses. PCs, even notebooks, don't carry around very well. I'd prefer to carry a mini MP3 player around than to carry a PC around. I prefer a PlayStation for many games over a PC.

    I'd prefer it if my microwave had its own embedded computer for timing, rather than having to hook a PC up to it in order to cook up my KD. :-)*

    Judging by sales, I'd think the general public agrees with me, too.

    Fact is, it's simpler to just hit a single button on a separate physical device than it is to hit a bunch of buttons on one. It seems that many programmers completely forget about ease-of-use on a physical level.

    Of course, I'm just a grumpy old engineer, and an embedded one at that. I guess I'm the guy you're all rallying against right now...
  • by dimator (71399) on Wednesday March 14, 2001 @12:48AM (#365255) Homepage Journal
    I prefer a standalone DVD player to a PC. I prefer to use a Palm for storing addresses. PCs, even notebooks, don't carry around very well. I'd prefer to carry a mini MP3 player around than to carry a PC around. I prefer a PlayStation for many games over a PC.

    Ya, it would be great if there was a standalone word processing device that I could go to to do my papers. And then one right next to it that would do spreadsheets. And next to that, one that would check my email. And one more, with a nice big monitor, to browse the web! Seems kind of wasteful; maybe we should just put all these functions on one device? Wouldn't it also be really cool if said device could play my MP3s, or play games, or play DVDs? Oh wait.......

    I don't buy this multiple-device idea. While it might be true that the devices you mentioned are doing well in sales, aren't they a little more specific in purpose than the tasks I mentioned? The PC has lasted this long due to its general applicability to a slew of applications.

    Also, which one of your devices (aside from the playstation) would be worth the plastic it was made out of without a PC it could dock/communicate/exchange-data with?

    Of course, I'm just a naive young software-developer, and I'd be out of a job if not for the PC. :)

    --
  • by dimator (71399) on Tuesday March 13, 2001 @08:14PM (#365256) Homepage Journal
    There are a lot of technologies out there that suck. Computers have many problems. But "have contributed nothing useful"? How many of these scientists and engineers would be where they are without computers? Indeed, how many of them would have been able to schedule, arrive at, and execute their trip to this meeting?

    I don't know why they would say such a stupid thing... I'll assume we all took what they said out of context or too seriously.

    --
  • by mduell (72367) on Tuesday March 13, 2001 @08:16PM (#365257)
    And it makes things even more confusing for grandmothers trying to log on to an NT box.
    "Control-Alt-Delete? But won't that stop it?"

    Mark Duell
  • by adubey (82183) on Tuesday March 13, 2001 @08:18PM (#365258)
    Of course, when people say that "design" will save the world, they usually mean their idea of design, which might not jibe with yours or mine.

    No timothy, when they say "design", I believe they are referring to things like usability testing. In other words, taking a software package to groups of users and designing statistically sound experiments to see what users find easy and fast to use. In other words, users' ideas of good design - not yours, not mine.

    If you're interested, maybe read some [asktog.com] sites [useit.com] on design. [acm.org]

    Moreover, I think they are also saying that VCs should at least be aware of what theoreticians are thinking about, so they make better use of their investors' dollars.

  • by Ukab the Great (87152) on Tuesday March 13, 2001 @08:44PM (#365259)
    The moment someone designs technology as an end and not as a means, that technology is issued a death sentence. It might be commuted for 15 or 20 years, but it will eventually happen. The PC isn't dying; it's been slowly murdered for the last two decades by many companies (one in Redmond, Washington comes to mind) that have made the PC so ridiculously difficult to use and maintain that people are being driven to network appliances. For many years, makers of software and hardware have lost touch with the needs of their consumers. The latest buzzword-compliant technology gets higher priority than what could actually help someone use their computer more efficiently and effectively. The perfect example (from soooo many to choose from) would be the 3.5" magneto-optical disk. It was rewritable, damn reliable, as small as a floppy and, had it been produced in massive quantities, massively cheap. But that didn't fit the agendas of the technology industry. They backed Zip drives and SuperDisks that were far less reliable and held far less data. When it became absolutely critical to hold data sizes larger than 100+ MB, they came up with another kludge: CD-RW--technically ungraceful (it has to rewrite the entire disk every time it's written to), saddled with a file system that requires special software (for Windows, and I think Mac) to read, and still has trouble fitting in your pocket. Yet another missed opportunity for the tech industry.

    One more example (this time in the present): FireWire. Apple, one of the few companies to move computer technology ahead (despite all of its numerous business/PR flaws), has started putting internal FireWire buses in its computers. Why didn't any other computer/motherboard companies think of this? Don't they understand that FireWire cables are far less of a hassle than ribbon cables, and block airflow far less? Don't they recognize the ease of use of being able to chain FireWire drives together? Don't they understand that external FireWire is probably the easiest way for non-geeks to add new hardware (without the need to buy hubs)? But where is Intel? Where is Western Digital? Where is Seagate, or Asus, or Abit, or Tyan, or any of the others? Nowhere, that's where. In fact, they barely put any stock in USB. Rumor has it that when Apple announced it was killing serial and replacing it with USB, an Intel executive called Steve Jobs to thank him for the bold move: "Getting all the others [OEMs] to go to USB was like herding cats."

    To capitalize on the obvious pun, technology sucks because too many people are pussies

  • by Mox-Dragon (87528) on Tuesday March 13, 2001 @09:08PM (#365260)
    Programmers might not get the satisfaction of building something useful and might not experience the artistic delight of design, but we at least don't have to work as hard. And when it comes to the bottom line, that's all that counts.

    What are you talking about? Programming (for me, anyway) is ALL about the satisfaction of building something useful and the artistic delight of design - in programming, you build something from quite literally nothing - you create order from chaos. Programming is speech, but it's much more than that - to be a good programmer, you have to think in abstract ways and be able to think truly dynamically - static thinkers have no place in the art of programming. Anyone who says they are programming for *just* money is NOT an artist. Good code is truly poetry, and good programmers are truly artists.
  • by proxima (165692) on Tuesday March 13, 2001 @08:21PM (#365261)
    Industrial designers poked fun at virtually all facets of computers and other electronic gadgets, and the Apple iMac--displayed in PowerPoint presentations in its groovy new shades

    Funny... computers appear to be useful enough to give PowerPoint presentations, quickly and easily presenting information to a large group. I find it a bit hypocritical that they'd bash computer design and ease of use and then use PowerPoint instead of some other presentation medium.

  • by lpontiac (173839) on Tuesday March 13, 2001 @11:49PM (#365262)
    I prefer a standalone DVD player to a PC. I prefer to use a Palm for storing addresses. PCs, even notebooks, don't carry around very well. I'd prefer to carry a mini MP3 player around than to carry a PC around. I prefer a PlayStation for many games over a PC.
    And you can walk into a store and buy all of these separate appliances. So how can engineers complain that the CS people aren't making them?
  • by Megahurts (215296) on Tuesday March 13, 2001 @08:20PM (#365263)
    I'm a college student who recently decided against continuing a major in computer science, primarily because the code bases I've worked with have been so horribly designed that they're beyond repair. The way I see it, we've (Americans, that is. I know much of the world is quite different) become quite fixated on the miracle of computers. But very few people ever actually learn how they work or how they can be properly and efficiently integrated into our lives. So we get bad designs from hardware and software vendors who realize that there's a large number of people unwilling to make the investment in knowledge necessary to choose the good from the bad, and who will buy anything they see on a billboard, on the television, and (decreasingly) in magazines for entirely superficial reasons. Had they known better, they could have avoided the junk, or at least returned it for a refund, deselecting the implementers of inferior technology from the economic gene pool.

    In explaining such issues to friends not familiar with the industry, I'll often draw parallels to similar situations. With this one, I'd say the computer craze is now at the point the car craze was in the late 1960s. Hobbyists are still common but on their way out. More and more people want the technical guts of the technology hidden in favor of its practical purposes. Perhaps this economic turn is analogous to the oil crisis. (And quite similar: I've heard that at least some of it is due to the California legislature and the power companies scratching each other's backs to create the energy crisis out here. Personally, it wouldn't surprise me, since I feel absolutely no trust toward the motives of either group.)

    ---

  • by madcow_ucsb (222054) <slashdot2@ s a n k s . net> on Tuesday March 13, 2001 @10:42PM (#365264)
    Second, they recommend creating "simpler" and "distributed" devices instead of monolithic boxes that do everything. What the hell does this mean, what devices really need more intelligence? All I can think of is one of those computerized thermostats. Whoopee.


    Seriously...I just have visions of what would happen if my appliances started communicating with each other...

    Fridge: Ok everyone, we know Alex has a final tomorrow at 8am.

    All Kitchen Appliances: *evil laughter*

    Fridge: Everybody turn on in 3...2...1...NOW!

    *All appliances in the house turn on at once*

    *Circuit breaker trips*

    (At 11am the next morning)
    Alex: NOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO!


    You know it'll happen. One day. When you least expect it. They'll turn on you.
  • by PD (9577) <slashdotlinux@pdrap.org> on Tuesday March 13, 2001 @08:15PM (#365265) Homepage Journal
    The engineers have bought into the myth of the dying PC. Horsepucky. The PC is here with us forever, and as time goes on more and more things will be integrated into it.

    Distributed systems are a nice thing in principle, but some problems can be broken up only so far. And when it comes down to it, nobody wants to buy a specialized piece of a computer when they can get their generalized computer to do the job.

    Look at the history of tools. First there were separate tools, each one doing a single or small number of jobs. Even a Swiss Army Knife was limited to about as many tasks as it had specialized attachments.

    People like to pooh-pooh the computer as being "just another tool". But the computer is far, far different from any other tool that came before. The computer has the ability to be an INFINITE (or at least huge enough that you won't exhaust the possibilities in the lifetime of the universe) number of tools.

    The engineers are being engineers. Who can blame them? They like single purpose tools. Heck, we like single purpose tools too, and that's why we generally embrace the UNIX philosophy of making a program do one thing, and do it well.
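
    The "do one thing, do it well" composition the UNIX philosophy describes can be sketched in a few lines. This is purely an illustrative Python analogy (the function names grep/sort/uniq just mimic the classic shell tools), not anyone's actual code:

```python
# Three tiny single-purpose filters, composed the way
# `grep error log | sort | uniq` composes on a UNIX command line.

def grep(lines, needle):
    # keep only lines containing the needle
    return (line for line in lines if needle in line)

def sort(lines):
    # consume the stream and re-emit it in sorted order
    return iter(sorted(lines))

def uniq(lines):
    # drop adjacent duplicates (like uniq(1), assumes sorted input)
    prev = object()  # sentinel that equals nothing
    for line in lines:
        if line != prev:
            yield line
        prev = line

log = ["error: disk", "info: boot", "error: net", "error: disk"]
result = list(uniq(sort(grep(log, "error"))))
print(result)  # ['error: disk', 'error: net']
```

    Each filter is trivial on its own; the power is entirely in the composition, which is exactly the point the comment makes about specialization living in the software.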

    But the difference is that our specialization is in the software, and the specialization they are proposing is in the hardware. If I want a single purpose tool, I don't need a computer to get that.

  • by TheDullBlade (28998) on Tuesday March 13, 2001 @08:23PM (#365266)
    Angry denial reiterated.

    Supporting claim. Second supporting claim.

    Revelation of inconsistencies in the complaints.

    Setup for attempt at witty attack on academics.

    Punchline of witty attack.

    ---
  • by ka9dgx (72702) on Tuesday March 13, 2001 @08:33PM (#365267) Homepage Journal
    The secure path in NT is Control-Alt-Delete. There is a very sane reason for this: it's not allowed to be intercepted by ANY application running under NT. Thus, you can ALWAYS know that the OS is in control when you press Control-Alt-Delete. This is one of the GOOD features of the operating system, and it helps prevent a trojan horse from taking your password.
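
    As a purely conceptual sketch (toy Python, not actual Windows internals), the value of a reserved secure attention sequence comes down to one rule: the kernel refuses to let any application - trojan or otherwise - hook that key combination, so pressing it always reaches the real OS:

```python
# Toy model of a secure attention sequence. The class and key names
# are made up for illustration; only the design principle is real.

class ToyKernel:
    SECURE_SEQUENCE = ("ctrl", "alt", "del")

    def __init__(self):
        self.hooks = {}  # application-registered key handlers

    def register_hook(self, keys, handler):
        keys = tuple(keys)
        if keys == self.SECURE_SEQUENCE:
            # The whole point: the OS never hands this sequence out,
            # so a trojan cannot attach a fake login box to it.
            raise PermissionError("secure attention sequence is reserved")
        self.hooks[keys] = handler

    def dispatch(self, keys):
        keys = tuple(keys)
        if keys == self.SECURE_SEQUENCE:
            return "trusted login dialog"  # always the genuine OS path
        handler = self.hooks.get(keys)
        return handler() if handler else "ignored"

kernel = ToyKernel()
kernel.register_hook(("ctrl", "c"), lambda: "app saw ctrl-c")

print(kernel.dispatch(("ctrl", "c")))           # app handler runs
print(kernel.dispatch(("ctrl", "alt", "del")))  # only the OS answers

try:
    kernel.register_hook(("ctrl", "alt", "del"), lambda: "fake login box")
except PermissionError as err:
    print("trojan blocked:", err)
```

    A password-stealing trojan works by imitating the login screen; since it can never receive the reserved sequence, anything you type after Ctrl-Alt-Delete goes to the genuine dialog.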

    It's too bad Microsoft couldn't build applications the same way, safe from Trojan Horses.
    --Mike--

  • by alewando (854) on Tuesday March 13, 2001 @08:22PM (#365268)
    When engineers sneer at computer science, I just chuckle to myself. Because I know something they don't know: they're just jealous.

    Engineers are jealous of programmers. It's that simple. Programmers have an easy life, after all. I only work a few hours a day, get paid big bucks, and for what? For telling a machine what to do. For the act of mere speech. It's Rapunzel's tale incarnate: I spin words into gold.

    Engineers have too many constraints; the standards are too high. When the Tacoma Narrows bridge fell down, heads rolled. But when a bug in the latest version of NT disables an aircraft carrier, Microsoft doesn't get blamed at all. Bugs are par for the course in our industry, and we have no intention of changing it. It means higher profits from fixes and lower expectations. How are engineers supposed to compete with those sorts of odds?

    I admit I considered going into engineering when I started my college days, but I was quickly dissuaded. The courses were too involved, whereas the CS courses were a breeze for anyone who didn't fail calculus. And I don't regret it at all, really.

    Programmers might not get the satisfaction of building something useful and might not experience the artistic delight of design, but we at least don't have to work as hard. And when it comes to the bottom line, that's all that counts.
  • by root (1428) on Wednesday March 14, 2001 @06:16AM (#365269) Homepage
    They lambasted CS types for developing complex and useless technologies.

    That's because when someone comes up with a useful technology, even something as simple as LZW compression or an MP3 encoder, NO ONE ELSE CAN USE IT in their product. Writing products that use someone else's file format is called a "copyright violation". Standardizing on one crypto algorithm is called patent theft. Making CPUs with compatible MMX instructions gets you sued by Intel. Making DVDs playable on "non-approved" systems gets you jailed, or served with orders from people halfway around the world.
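
    LZW really is about as simple as useful technology gets - the core of the (famously patented) algorithm fits on one screen. A minimal, illustrative Python encoder, emitting integer codes only (real implementations add bit-packing and a decoder):

```python
def lzw_encode(data: bytes) -> list[int]:
    # Start the dictionary with all single bytes 0..255, then grow it
    # with each new sequence seen; output one code per longest match.
    table = {bytes([i]): i for i in range(256)}
    next_code = 256
    current = b""
    out = []
    for byte in data:
        candidate = current + bytes([byte])
        if candidate in table:
            current = candidate          # extend the current match
        else:
            out.append(table[current])   # emit code for longest match
            table[candidate] = next_code # learn the new sequence
            next_code += 1
            current = bytes([byte])
    if current:
        out.append(table[current])
    return out

codes = lzw_encode(b"TOBEORNOTTOBEORTOBEORNOT")
print(codes)  # 24 input bytes compress to 16 codes
```

    The repeated "TOBEOR" runs get replaced by dictionary codes >= 255, which is the entire trick - and exactly the sort of thing that, under the Unisys patent, nobody else could ship in a GIF writer without a license.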

    So yeah, "CS types develop complex and useless technologies," because we have to carefully avoid reinventing someone else's wheel or we get sued into bankruptcy.

    One result is millions of different wheels of different diameters, shapes and track widths that are all incompatible with one another. Sounds pretty messy, right? It also happens to resemble what we see today in the computing industry.

    The other result is people getting fed up with all the incompatibilities and looking for a standard, any standard. And since the standard is proprietary, naturally this favors the growth of monopolies, e.g., Microsoft, who then uses its position as the OS "standard" to create other standards, such as the Excel and Word formats, whilst actively blocking anyone else from participating in those standards.

    IMO both patent lifetime and copyright lifetime ought to be cut to 10 years tops for all things computing-related, hardware or software, because stuff in this field ages faster than any other traditionally patented and copyrighted work.

    And there needs to be an irrevocable expiration for abandoned patents and copyrights too. It's absolutely insane that Atari 2600 games are still locked away by copyright while no one is producing them. And they'll be locked away for over a century under current IP law. Is this right?

  • by hugg (22953) on Tuesday March 13, 2001 @08:42PM (#365270)
    If it wasn't for us software guys, you scientific types would still be writing programs in Fortran.

    Oh that's right, you ARE still writing in Fortran. My bad.
  • by cje (33931) on Tuesday March 13, 2001 @08:33PM (#365271) Homepage
    Anybody remember the original Apple II?

    The RESET key, located at the top-left corner of the keyboard, triggered a software reset. This had the effect of (depending on the software you were using) terminating the program and dumping you back to a BASIC prompt or erasing whatever unsaved data you had or doing a hard reboot of the machine. Users quickly found out (the hard way) that this button was way too easy to press by accident. In fact, this problem was so pervasive that magazines such as Creative Computing began advertising for "RESET key protectors" .. typically these were pieces of firm foam rubber that you would place underneath the RESET key (you had to pry up the keycap) .. resulting in a key that was still "pressable", albeit with a bit more effort.

    In later versions of the Apple II/II+ (and in subsequent machines such as the IIe, //c, and IIgs), Apple listened to their users' complaints, learned from their mistake, and required a Ctrl-RESET combination in order to actually trigger the reset. That hard-learned lesson carried over to other hardware and software manufacturers, including the choice of Ctrl-Alt-Delete.
  • by suss (158993) on Tuesday March 13, 2001 @08:10PM (#365272)
    Targets of the critics' scorn included convoluted commands such as the common "ALT-CONTROL-DELETE" sequence used to close a program or perform an emergency shutdown.

    Put it under F1, see if that makes them happy. You know, there's a reason it's such a 'convoluted' command: it keeps people from accidentally executing it!
  • by IvyMike (178408) on Tuesday March 13, 2001 @10:47PM (#365273)

    I thought my sarcasm was pretty good, thank you very much.

    Perhaps my bile was uncalled for, but I'm sick of people implying "good design is easy, why doesn't someone just do it?"

    Good design and usability are difficult. Do you think that the industry doesn't know that billions of dollars and instant fame and fortune are at stake here? Do you think that the industry doesn't try really, really hard to get that money?

    There's a right way to criticize usability--one author who does it right is Donald Norman (I'm sure there are others, but on this topic I've only read Mr. Norman's books). He manages to carefully consider what is wrong with a design, discusses the alternatives, and points out how usability could be improved.

    There's also a wrong way. Say something like "Why can't my computer be as easy to use as a toilet?" God, I'm getting pissed off again. What's the feature list of that toilet? And what's the feature list of your computer; can you even hope to cover that list in any amount of detail? In fact, does your computer actually even have a standing feature list, or do you actively install new software (and thus new features) constantly? Dammit, everybody who uses a computer has complex needs--I have a web browser, an email client, a remote telnet session, an mp3 player, and a "find file" all open RIGHT now, and I suspect that I'm pretty tame compared to the average slashdot reader. I'm going to play an online game with friends in another state shortly. I could spend hours describing what I want EACH ONE of these to do. I happen to think that, all facts considered, the usability is pretty good. (And I might add: Damn, it's cool to live in the year 2001. This stuff rocks.)

    Are things perfect? Of course not. One company has a monopoly on the desktop market and has very little incentive to innovate (in spite of its claims to the contrary) and every incentive to keep the status quo. Yes, the "START" button is retarded. Should we strive to improve the state of the art? Of course. Would it be awesome if it were easier to do my taxes? Sure, but are you absolutely sure you want the automated solution you described when it sacrifices transparency (are you sure your taxes were done correctly in that system?) and possibly privacy (who's keeping track of all that information flowing between your income-payers and the government?)? I actually think TurboTax made my taxes about as easy as I'd like--it asked me a simple set of questions, I answered, and it was done. Any easier, and I'm not sure I'd completely trust it.

    I actually don't know why you're arguing, since in at least one respect, you agreed with me. You said:

    Simplicity of interface, sheer useability, takes a lot of talent, skill and creativity.

    If you think about it, the article in question basically said these are all trivial and require little skill or talent, and it said so with a condescending attitude. It's actually really, really hard. Dismissing the problem is unwarranted and deserves an equally scathing reply.

  • by IvyMike (178408) on Tuesday March 13, 2001 @08:15PM (#365274)

    Dammit, I hate these fuckers.

    First of all, they contradict themselves. "Computers are too hard," they whine, but when a computer interface remains consistent and usable for twenty years: "If Rip Van Winkle went to sleep in 1982 and woke up today, he'd be able to drive our modern computers with no problem because they're essentially unchanged".

    Second, they recommend creating "simpler" and "distributed" devices instead of monolithic boxes that do everything. What the hell does this mean, what devices really need more intelligence? All I can think of is one of those computerized thermostats. Whoopee.

    Look. Computers are complex because your needs are complex. Worse yet, my complex needs are inconsistent with the needs of others. Try to download MP3s on your toaster. Try to do your taxes while downloading porn while instant messaging your friend in France while checking the weather on one of their great appliances. Try to use that "more intelligent than a computer" airport toilet to write up your PowerPoint slides, you pompous pricks.

    Actually, in this case, that might have actually worked.

  • by grammar nazi (197303) on Tuesday March 13, 2001 @08:14PM (#365275) Journal
    Not contributing anything useful?

    I just love it when Scientists fling mud and proclaim that the 'real-world' isn't science.

    In mathematics, we have the very 'real' Taguchi quality control methods that revolutionized manufacturing processes, but according to my math professors, "It's not real mathematics, just some linear algebra application."
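
    For readers who haven't met it: the heart of the Taguchi approach those professors shrugged off is just a quadratic loss function, L(y) = k(y - m)^2, which charges a cost for ANY deviation from the target m, not only for parts falling outside a spec limit. A tiny illustrative sketch (the target, k, and measurements below are made-up numbers):

```python
def taguchi_loss(y, target, k):
    # Taguchi quadratic loss: cost grows with the square of the
    # deviation from the nominal target value.
    return k * (y - target) ** 2

target = 10.0   # hypothetical nominal dimension, mm
k = 2.5         # hypothetical cost scaling, dollars per mm^2

# Two lots, both entirely "in spec" (10.0 +/- 0.3 mm), but the second
# clusters tighter around the target.
lot_a = [9.8, 10.2, 9.7, 10.3]
lot_b = [9.95, 10.05, 9.9, 10.1]

def mean_loss(lot):
    return sum(taguchi_loss(y, target, k) for y in lot) / len(lot)

loss_a = mean_loss(lot_a)  # 0.1625
loss_b = mean_loss(lot_b)  # 0.015625
print(loss_a, loss_b)      # lot_b incurs roughly a tenth of the loss
```

    Pass/fail inspection rates both lots identical; the quadratic loss makes the tighter lot measurably better - simple math, revolutionary consequence, which is rather the commenter's point.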

    On the topic of manufacturing: metal can now be formed and machined into virtually any shape, and ceramics and metals can be mixed and then burned to form aluminum tools (molds) for injection-molding parts. "That's just a trick of sintering the ceramic," my ceramics professor told me.

    My point is that industry types, whether they are applying neural networks to read handwriting or creating thinner flat-panel displays, solve the same complicated types of problems that the more 'scientific' community solves. The scientific community discredits their work because "theoretically it can be done, so why bother doing it." It's as though the companies that want to enhance their products by funding research shouldn't fund the research that is most likely to enhance their products!

    I'm sorry to sound harsh, because this strikes close to home for me. I was on track for a PhD, but quit, and now I'm having a lot more fun developing optimized neural networks to do handwriting recognition.

  • by atrowe (209484) on Tuesday March 13, 2001 @08:30PM (#365276)
    I love my computer enough as it is. If I had a computer that sucked, I'd never leave the house!

"If a computer can't directly address all the RAM you can use, it's just a toy." -- anonymous comp.sys.amiga posting, non-sequitir

Working...