Scientists And Engineers Say "Computers Suck!" 251
drhpbaldy writes: "At the latest ACM meeting,
scientists and engineers threw mud at computer scientists for not contributing anything useful. They lambasted CS types for developing complex and useless technologies. Some of the fault was placed on VCs for funding only the fanciest and stupidest technologies." Of course, when people say that "design" will save the world, they usually mean their idea of design, which might not jibe with yours or mine.
Sinewave (Score:1)
The Rambler
Re:Secure Path Login/LogOut (Score:1)
Re:Computer scientists will rule the world (Score:1)
I could have gone to school for CS... but I figured: why pay so much money for a degree in something I could easily teach myself?
Instead, I got my Aerospace Engineering degree with an emphasis in Fluid Dynamics and Automatic Control Systems (lots of complex coding in those things, btw).
Computers are great tools, but I really doubt a CS guy could write the code I do... they simply don't understand the engineering stuff that needs to be calculated. Fluid dynamics is an art: you only learn how to solve the equations after lots of practice learning what can be dropped or substituted in the equations of motion.
Geesh, am I the only one? (Score:1)
These companies told the world that computers were ready to make their lives better. They made a lot of laughable statements that were, unfortunately, easy and desirable to believe. Then these companies mass-marketed their products and made bundles of money. Imagine vulture, er, venture capitalists in 1910 saying "London to New York in 3 hours via plane!" This is what happened in the computer industry, and there has been a lot of disappointment as a result.
Consequently, Intel's research budget grew very fast, evidently much faster than they could improve their designs, by the look of things. However, the companies that were making real advances in processors have been pushed out of business (next week, we'll discuss whether the "efficiency" of capitalism is really the right economic principle to maximize).
The term Artificial Intelligence (my research, sort of) is horrible, and has probably contributed to the disappointment. I don't think software techniques have matured much. Hardware and hardware processes have become much better -- memory densities, magnetic storage densities, even CRT manufacturing. But I really don't see any improvement in available software. At least with GNU/Linux, there's an attempt to find the right way to do things even if it takes several tries and isn't popular or financially rewarding.
The best thing that has happened, by my estimation, is the interconnection of computers. Networks have proven far more valuable than so many other technologies like speech recognition and vision. Those technologies are very, very interesting, and it's proper for people to study them. But natural language processing has not had an effect on how we get through each day, yet, despite hype from the market.
It's interesting, therefore, to see how Microsoft, Intel, etc. hype the Internet. Watch how they try to spin their products to say that they add "value" to the Internet experience. An Intel P-MCMXXXI doesn't make the Internet any better. The important aspects of the 'net don't depend on Flash or Javascript, and certainly don't depend on Flash or Javascript executing on bajillion-megahurts processors. The Internet, the best thing to come to the public from the tech sector (followed by handhelds, I think =-), is useful with or without Intel's latest and greatest. The Internet is even better without Microsoft's latest and greatest Media-Hijacker. =-)
The Internet is valuable for the transmission of information. Computers are valuable for the processing of information in simple ways, really quickly. Neither of these creates new information in any meaningful sense--we still need humans or nature or whatever for that. But none of this sounds exciting enough to sell computers, and as a result Microsoft and Intel, etc., created the hype that has led to a large disappointment. They preached the land of milk and honey but delivered a desert (I better watch out for lightning bolts after saying that...).
I like to say that these companies, and the whole PC industry, have been "taxing naive consumers." And now consumers are realizing that these companies have squandered their money. It is ironic, and slightly humorous if you've a strong stomach, that the academics are getting blamed.
-Paul Komarek
Re:It is not science, it is an art!! (Score:1)
And indeed, for a scientist, the code is not the thing that is important, it's the idea! Imagine a very simple thing, the quicksort algorithm. I can implement it in many different programming languages, but the thing is still the same. (BTW, my personal favorite for this is Haskell, which makes for really beautiful code:
quicksort :: Ord a => [a] -> [a]
quicksort [] = []
quicksort (x:xs) = quicksort [y | y <- xs, y < x] ++ [x] ++ quicksort [y | y <- xs, y >= x]
).
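(For the skeptical: loaded into GHCi, quicksort [3,1,4,1,5] comes back as [1,1,3,4,5] -- the same idea, whatever the language.)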
Sebastian
Re:It is not science, it is an art!! (Score:1)
Sorry, some of the code got removed in the post...
Re:Convolusion isn't necessary. Try dialogs. (Score:1)
One thing I think I'd like to see is the rest of the app graying if a modal dialog box is invoked, making it clear that the dialog is "in charge."
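Something like this is cheap to prototype, too. A rough sketch in Python/Tkinter (my own toy example, widget names invented, not a real app): the dialog grabs all input, and the widgets behind it are disabled until it's dismissed.

    import tkinter as tk

    root = tk.Tk()
    save_btn = tk.Button(root, text="Save")
    save_btn.pack(side="left")

    def open_dialog():
        dlg = tk.Toplevel(root)
        dlg.transient(root)  # keep the dialog stacked above its parent
        dlg.grab_set()       # modal: all input is routed to the dialog
        save_btn.configure(state="disabled")  # "gray out" the app behind it

        def dismiss():
            save_btn.configure(state="normal")  # hand control back to the app
            dlg.destroy()

        tk.Button(dlg, text="OK", command=dismiss).pack(padx=20, pady=20)

    tk.Button(root, text="Open dialog...", command=open_dialog).pack(side="left")
    root.mainloop()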
Re:Secure Path Login/LogOut (Score:1)
Correct.
To keyboard-reset a VMware session, you must use C-M-Ins, instead of C-M-Del.
Rev. Dr. Xenophon Fenderson, the Carbon(d)ated, KSC, DEATH, SubGenius, mhm21x16
Re:Technology as an ends and not as a means (Score:1)
it comes down to licensing -- not technical issues. manufacturers to this day still have to pay the ``apple tax'' for each firewire port on their device, as opposed to USB where no such ridiculous licensing is required.
MO never took off because the various disk manufacturers could never agree on common formats. ZIP was similarly doomed from the start since only one company manufactures it.
USB appears to be obsoleting serial and parallel for all practical purposes... it was showing up on PC motherboards before apple came out with the imac. it's becoming difficult to get a PC motherboard without USB. printers, scanners, mice, keyboards, even the beloved cd burners are now available in USB form, knocking out their bus-predecessors.
I personally think compactflash prices and densities will eventually improve to the point that they will replace the floppy. it just needs to get cheaper.
Scientists live in la la land (Score:1)
People are stupid. People asked for stupid stuff on their software. The software is stupid.
Poke fun all you want, but since the invention of photo-paper, science has contributed absolutely nothing to the important field of pornography distribution. Look how far we've come.
-the Pedro Picasso
--
internal FireWire connections (Score:1)
Re:CTRL-ALT-DEL (Score:1)
It might be fair to blame the industry for not making usability a priority, but it's generally a low priority for customers too, and companies prefer not to spend resources on features customers don't care about.
Re:Computer scientists will rule the world (Score:1)
I'd have to say that writing a well crafted piece of software is extraordinarily satisfying, but it doesn't come close to the satisfaction of having built a physical device. Unfortunately, physical devices are more expensive and time consuming to build. The way I see it, hardware design is the same as programming, only in a different medium. The advantage is the device you can show others when you've finished.
Of course, probably the most satisfying thing I've ever done was to build a device based on a microprocessor, and to then write all the software that ran on it. :-)
Re:Engineers V. CS (Score:1)
-Circuit Board Design Progs, Whiteboards, Email.. (Score:1)
All the scientists and engineers who agree with this article *must* stop using these obviously non-useful and pathetically convoluted software tools or risk being kicked in the
Have fun with your pencils and telephones.
Re:Convolusion isn't necessary. Try dialogs. (Score:1)
As if I didn't have better things to work on...
Re:Oh please... (Score:1)
From a Rhetoric perspective, which is better?
A) a constitutionally based representative democracy with a socialist education system and a fascist military, etc.?
or
B) USA-Democracy (which implies all that)?
Re:Oh please... (Score:1)
Re:LAMEST. ARTICLE. EVER. (Score:2)
From Jonathan Walls
This is insightful? Does it not strike moderators as pathetic to see a knee-jerk reaction to criticism, laced with bad sarcasm, insults and poor logic, pandering to the tastes of the audience?
Especially in a community that likes to think of itself as intelligent and cutting-edge - you would have thought a bit more open mindedness could be expected. Anyone with the ability to see another person's point of view would acknowledge that using the Start button to stop, or requiring hardware knowledge to install an OS, and so on, is indicative of a situation that needs improvement. And remember this is criticism of an industry rather than individuals, so there's no point cherry-picking to prove a point.
As for "computers are complex because your needs are complex", that sound like a pissing competition i.e. "My needs are so complex you could never design a simple [set of] interface[s] to solve them. Gee, you must be pretty simple if you think it's possible for you." Then you get, "my complex needs are inconsistent the needs of others", or in other words, "I am such an individual that noone could ever produce a simple solution that suits me."
Personally, I want things to be simple. I'm not strutting around claiming to be a complex individual with difficult-to-meet needs. For a start, such a person sounds like an arsehole. But more to the point, I have lots of simple needs. Take the example of doing my taxes - I don't want to, I want a simple process. After all, all the figures I provide are automated somewhere anyway; I don't want to expend any effort at all, I just want a simple solution. Such a solution would undoubtedly have a complex back-end, take a lot of work if it's possible at all currently, and take some talented people to do it right. If I simply saw a one-page printout at the end of the tax year with a breakdown of income and taxes I would be very happy (and rather impressed). Simplicity of interface, sheer usability, takes a lot of talent, skill and creativity.
If the only example of an intelligent device you can think of is a computerised thermometer, I wouldn't hold much hope of ever getting a good job requiring any of these skills.
Finally some truth at Slashdot! (Score:2)
The main problem is the toolkits/frameworks that are used for developing software. Most Unix toolkits really suck! What's even worse is that the language they are designed in, be it C or C++, makes such a mess, because those languages weren't designed for graphical interfaces; they are portable assemblers.
If the world programmed in Smalltalk, life would be much easier. Imagine if everybody had truly reusable classes. Although maybe that would put some programmers out of work. Using a specific language doesn't mean that code reuse will be well done; a lot of it has to do with the programmer.
Maybe one of you has the idea that will push us past the archaic languages that we currently use.
Re:Brain dead device drivers (Score:2)
SCSI cards handle transfers, IDE makes the CPU do it. Intel wants to drive demand for CPUs, so do they push SCSI as the standard interface mfrs should use, or IDE?
Same with USB vs. FireWire. (not to mention the NIH syndrome).
Re:Looks like (Score:2)
Just because it is programmable doesn't mean a CS type programmed it.
I also believe that the ACM comments went too far, but they do have a point, although I don't think that CS is solely to blame.
So many people have the attitude that technology solves all of our problems. The thing is that technology seems to cause as many problems as it solves, and that blame can be spread everywhere, from consumers to politicians to VCs to designers to engineers to programmers, etc.
Re:Computer scientists will rule the world (Score:2)
Programs are abstract. The only reason we use them is that they can be made to work with non-abstract objects. The thing is, most programmers only have access to printers and monitors as output methods, which aren't bad, but IMO that can limit the experience and capabilities of the programmer.
I am something of a programmer and something of an engineer; the benefit I get from that is that _I_ can make physical objects that are a result of my programs, or make electromechanical circuits that actually have real-world interactivity.
Re:Finally some truth at Slashdot! (Score:2)
Re:Finally some truth at Slashdot! (Score:2)
It's a late-binding language, meaning that all method lookups, etc. are determined at runtime. Better get more muscle to push the app.
It's, generally speaking, an interpreted language -- you're running on a VM, not unlike a JVM, in almost all cases of a SmallTalk runtime environment. Better get even more muscle to push that app.
It's garbage collected, meaning it's going to do evil things to you when you're trying to do something time-critical and it decides to do garbage collection (which you don't have control over -- nor does the paradigm of SmallTalk allow for that). Better hope garbage collection can be handled in another thread and you've an SMP machine to use for your app.
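(For comparison, environments that do give you a handle on the collector let you at least shove the pause around. A hedged sketch in Python -- whose gc module only controls the cyclic collector, refcounting still runs -- with do_time_critical_work as a made-up stand-in:

    import gc

    def do_time_critical_work():
        sum(range(100000))  # stand-in for the latency-sensitive section

    gc.disable()            # defer cycle collection during the critical part
    do_time_critical_work()
    gc.enable()
    gc.collect()            # take the pause now, when we can afford it

SmallTalk's paradigm offers nothing like this.)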
For some things, SmallTalk is great. For things like word processors, etc. it's a blessing.
For many systems tasks, such as UIs (not app UIs -- something more like X (Unix) or GDI (Windows)) or OSes, it's a poor fit. There are other good fits and bad fits -- and in making SmallTalk change to fit the ill-fitting things better, you lose much of the benefit that the language brought to the table, and you might as well have been doing the thing in C/C++/ObjectPascal/ADA95/etc.
As for truly reusable classes, SmallTalk doesn't make it magically so. It requires skill, even in SmallTalk, to do that.
Re:Secure Path Login/LogOut (Score:2)
Re:Secure Path Login/LogOut (Score:2)
I agree that it doesn't solve all the problems, but if you can close one out of three security holes, that's better than closing zero out of three.
This can easily be solved... (Score:2)
Bill Buxton (Score:2)
But he's done very good work in making, say, Alias | Wavefront's software be very usable by artists. Technically minded artists, to be sure, but there is a level of intuitive access to the program that just isn't found in a lot of other packages.
*n
Re:Looks like... MOTORS (Score:2)
-your motor example
1) A motor can spin.
2) A motor can't do anything else.
-camera
My camera doesn't have a computer in it. Adding computers to a camera doesn't make it functionally better. The function is to image a scene, with a print as the end product.
-computer isn't that great for drawing/modelling
1) Show me the standalone word processor that is better than a PC at drawing.
2) Show me the standalone computerized drawing pen that is better at word processing than a PC
3) Sure the computer isn't the best at every task, but it's damn adequate at trillions of them, and that equals power.
And another point that I didn't bring up before: The engineers who claim that specializing the computer into single use devices will improve it are trivially wrong.
A computer can simulate *any* tool that exists. The only difference between the computer and the real world is that 1 to many ratio of form to function. If the engineers claim that a change of form will improve the computer, it is easy to show that the net result is a lot more forms, but no additional functions.
So what their argument boils down to is simple human factors. They want to make computers easy to use by making them work just like objects we already know.
So in the end we agree. Why do artists need computers to draw with? They don't. My point is that a computerized pencil won't change that, unless that computerized pencil is integrated into a device with quintillions of possible states: a PC.
Re:Martketroids, etc. (Score:2)
> commands such as the common "Alt-Control-Delete"
> sequence used to close a program or perform an
> emergency shutdown
So the engineers are getting all concerned about human factors? I guess I wasn't aware that they had traded in their pocket protectors and slide rules.
Re:Looks like... MOTORS (Score:2)
Re:Looks like... MOTORS (Score:2)
And, learn the definition of an open mind, please.
Re:Secure Path Login/LogOut (Score:2)
You're missing the point. The operating system can trap whatever key sequence it wants - it is the operating system, so all keypresses are processed by it first. Of all the key combinations available on a keyboard, MS chose to use the combination traditionally associated with rebooting the system.
Re:LAMEST. ARTICLE. EVER. (Score:2)
While smart peripherals would help, the real cause of the problem is poor software design. There are more than enough CPU cycles to do everything in a timely manner, but the operating system doesn't schedule the CPU correctly. Brain-dead device drivers also contribute to the problem.
Re:Looks like (Score:2)
OTOH, the potentials
Caution: Now approaching the (technological) singularity.
My PC keyboard has a RESET button right now! (Score:2)
Re:Computer scientists will rule the world (Score:2)
Au contraire, Rodney. Exactly the reason I left engineering is that no-one in their right mind was going to give me two million quid to make a fast ferry because some hung-over graduate thought it would have fantastic seakeeping. Computing, OTOH, if I think it could be good, I'll sit down and code it. Man, this is way creative.
Dave
DISCLAIMER: Sometimes you are going to have to make software to an engineering quality.
If you think about it, most electronic stuff sucks (Score:2)
Why do I have to turn 5 knobs and push 4 buttons to make my home theater receiver/tv switch from dss to ps2?
Why do 90% of VCR functions only come on the remote, especially important stuff, like switch to the aux video source?
why does every piece of software come with crappy default settings?
why are we stuck with crappy interoperability between anything? DV vs. D8mm, firewire vs. whatever, ide vs. scsi
i have a pda, cell phone, pager, email, etc.
I know I'm generalizing, but these 'engineers and scientists' are the same jerks who've been pushing shitty technology down our throats
my 2 cents
Re:Looks like (Score:2)
You mention a total of 3 products here - and they all have various reasons why they're superior. The DVD player is better on a TV normally because the screen is bigger. I would much prefer to watch a DVD on a monitor of the same size vs. a television of the same size, simply because of better resolution. So DVD players lose on this point in my opinion. And it's damn easy to watch a DVD on a computer - stick it in the drive and w/ AutoPlay it happens. Then it's easier to control the DVD itself with a mouse than with a stupid remote control with a bad interface.
The other 2 devices are an issue of portability - and they'd be more powerful if only they could get more power into that same space, which will happen in time. Why carry around both a Palm and a mini MP3 player when you can carry around one device which is more robust?
Certainly you're not going to use your computer for a microwave, that's just ridiculous. But with a central computer that's powerful enough not to need upgrading every year and a half, you have a lot more time to invest in add-ons. You get that DVD player, you get that huge monitor, you get those nice controllers to play games with that are just like your Playstation. And it all ends up cheaper.
Imagine - instead of buying a DVD player, a television, a device to surf the web, a CD player, a tuner (you'd still need an amplifier of some sort), and a game console you buy a computer. There's several devices all rolled into one. Who wants all that crap laying around their house when they can have a central computer which powers all of this? And why can't this same central computer power multiple monitors, etc... It's a great deal.
That doesn't get rid of the need for a portable computer, and your portable computer could even hook up to your central computer, but why carry around both a Palm and an MP3 player? Who the hell wants to do that? Why don't I throw a CD player and a tape walkman into my backpack just for good measure?
I think people often confuse the idea of the desktop computer going away and computers becoming integrated into our lives. Of course computers are going to become integrated more in our lives. That takes time though, before it becomes really useful we need omnipresent wireless access with omnipresent IPV6 so everyone's toaster can be on line (http://toaster.myhome.com).
But all together it's really annoying to hear scientists bitching about this stuff. Everyone's just under this delusion of internet-time and they think that the infrastructure of the world will change at that same rate. Infrastructure does not change over night.
Re:Problem between CS and other sciences (Score:2)
The end result is lots of apps that are interesting from a CS point of view but completely useless to the people that paid for them. Or alternatively, dreadful from a CS perspective, but actually useful to the biologists that wrote them.
This is, no doubt, a general problem in the experimental sciences, which increasingly rely on information technology for data analysis and programming.
Software is less malleable than people think (Score:2)
People are always full of good advice that's harder to follow than they think.
Making good software requires an intimate knowledge of the user that is often practically unobtainable until you have a nearly finished product. When you have a geekish user targeted, they can probably do a good job of describing their needs and reacting to generalized descriptions of UI approaches. Most normal people struggle to describe their needs and mentally "help you" too much by ascribing to the proposed software capabilities that bridge the gap between what you are describing and what they need. These people can only contribute well when you have a nearly workable user interface that they can actually work on and which you can observe.
Only when you have a pretty functional product do you get the user feedback you need to throw out your bad assumptions, rip out your bad code and start from scratch.
This is why RAD and rapid prototyping tools like VB, PowerBuilder and Delphi are useful. In my experience there's lots to hate about these things, but they do allow you to do a lot of experimentation with UI.
I'm working now on a vertical market application that everyone agrees is very powerful, but most people agree is in many places hard to use. I am gradually improving things as I get to know the users better, but it is hard work and very risky -- what one person likes may be hated by another.
The terms in which we sell software are a problem for us too -- push a button and whee! The world is at your feet. Improving user interfaces requires a considerable taste for crow.
As software designers, we use a lot of stock approaches to things, and our tools have support for these stock approaches. The problem is that they are often a poor fit for tasks as the users understand them. For example, most RAD application tools have pretty good support for "master-detail" type screens. These were designed to handle typical header/line-item forms like invoice line-item breakdowns. It's tempting to use them for all kinds of one-to-many relationships -- except that unless you are talking about accounting, it's an unnatural fit for most tasks.
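(If you haven't met the term: "master-detail" is just one header record joined to many line items. A sketch in Python, with field names invented for illustration:

    # one "master" (the invoice header) owning many "details" (line items)
    invoice = {
        "number": 1001,
        "customer": "Acme Corp",
        "lines": [
            {"item": "widget", "qty": 3, "price": 4.50},
            {"item": "gadget", "qty": 1, "price": 19.95},
        ],
    }
    total = sum(line["qty"] * line["price"] for line in invoice["lines"])
    print(total)  # 33.45

Great for invoices; an unnatural shape for nearly everything else.)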
One interesting area I've been working on is PDA clients. It's been interesting because these stock approaches don't work on the PDA's limited screen real estate. This means that you absolutely have to go back to the drawing board and throw out the stock desktop approaches. In many cases the result is a more usable client. It's definitely inspired me to take a more clean sheet look at the knottier UI problems I have.
Jumping into a debate they don't understand. (Score:2)
The essence of the speakers' complaints was that computer engineers have spent the last five decades designing computers around the newest technology--not for the people who use the machines.
I think these folks are jumping into the middle of a huge cultural debate they don't understand.
That is the computer as tool vs. the computer as agent debate.
From your post, I'd place you in the computer-as-tool camp -- with the proviso that it is a novel and infinitely flexible kind of tool.
The people cited in the article are naively jumping on the computer-as-agent bandwagon.
I think a computer that understood what I wanted and did it for me would be a wonderful thing (if it didn't put me out of a job) provided we could build such a thing. But I think the longing for this has been created by the general abandonment of psychological and ergonomic principles as a guiding force to UI development in favor of stylistically driven designs (e.g. the put-it-in-a-toolbar movement of the early 90s and make-it-look-like-a-web-page movement of the late 90s).
You wrote:
The engineers are being engineers. Who can blame them? They like single purpose tools. Heck, we like single purpose tools too, and that's why we generally embrace the UNIX philosophy of making a program do one thing, and do it well.
Having built a number of pathologically flexible interfaces myself, I can say with some authority that normal users want tools that do one thing well too. When a user wants to twaddle the flim-flam, he wants to click on the flim-flam and get a pop-up menu that does twaddle (an object centric design); or he'll live with a menu choice called "twaddle" that allows him to select the flim-flam as the target (a functional design). What he doesn't want is a tool that allows him to construct a search template that will match the flim-flam and compose a series of operations to accomplish twaddling.
In other words, the user doesn't want to think about the tool you build, he wants to use it to accomplish his ends with the minimum of superfluous thought.
What these guys are really craving are not intelligent tools, but intelligently designed tools. Their thinking on this issue is just fuzzy because they're coming in late:
"If Rip Van Wrinkle went to sleep in 1982 and woke up today, he'd be able to drive our modern computers with no problem because they're essentially unchanged,"
Since when can usability be argued as a sign of unusability? If Rip Van Winkle woke up from the 19th c., I could show him how to use my favorite power tool -- the sawz-all -- in about ten seconds. That is because its design is perfect for what it is supposed to do.
Re:Convolusion isn't necessary. Try dialogs. (Score:2)
Isn't that what ctrl-alt-delete is for?
Usually, in that case, the mouse will still work in Windows or X. In Linux I hit ctrl-alt-esc and my pointer turns into a jolly-roger. I then click on the misbehaving window. If your mouse won't move, you can either hit that reset switch (I hope your FS does journalling) or, in Linux, hit alt-sysrq-s, alt-sysrq-u, then alt-sysrq-b. That is, in order, sync all FS, unmount all FS (actually remount RO), and boot.
Either way, modal dialogs will not work in many cases and you'll have to go to lower levels to recover somewhat cleanly.
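(Aside: on Linux the same emergency actions can be fired from software via /proc/sysrq-trigger, assuming the kernel was built with magic-SysRq support and you're root. A sketch in Python:

    # s = sync all FS, u = remount read-only, b = reboot (immediately!)
    for key in ("s", "u", "b"):
        with open("/proc/sysrq-trigger", "w") as f:
            f.write(key)  # the write of "b" reboots the box on the spot

Same s, u, b sequence as the Alt-SysRq chords.)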
If there was an LCD and a couple of buttons on the front panel, however, I would fully support a confirmation.
Re:not contributing anything useful? (Score:2)
jibe - see gibe
gibe - intransitive senses : to utter taunting words. transitive senses : to deride or tease with taunting words
Which is where the confusion comes from. In the sentence "I jibe him and you jibe with me while the yacht jibes" each jibe has a different meaning (taunt, agree, and change course respectively).
Lots of engineers ARE programmers.. (Score:2)
Um, not to burst anyone's bubble here, but most of my graduating EE/CompE class of 2000 is employed, directly or indirectly, as programmers. What they program typically isn't Windows, but more often than not embedded systems, control systems, etc. There _are_ software systems where instability is not an option, period - are you being fatalistic when you say that bugs are par for the course?
Engineering was a great choice as the basis of a primarily software-based career; getting to build a computer from being tossed some RAM, a CPU, a latch and some miscellaneous components was great experience, and it helps when you actually write software for a machine you didn't build (which in 99/100 cases is what actually goes on). It also leaves open the pure hardware side of the world too, in case the software industry blows up (which might happen, who knows).
Engineer and Programmer are not mutually exclusive. This is also being posted from Canada, where an MCSE isn't enough to call yourself an engineer, either. :)
Re:Looks like (Score:2)
Nah, all we really want is a standalone, networked hard drive that any of our separate devices can connect to/disconnect from while running.
Re:LAMEST. ARTICLE. EVER. (Score:2)
Re:LAMEST. ARTICLE. EVER. (Score:2)
Thanks for clearing that up. I got confused by my "responder's" confusion over AGP and USB... However, if I may be just a bit too picky, I think that AGP really is a port and not a bus. There can only ever be two devices using an AGP connection: the host CPU (through the chipset) on one end, and the graphics adapter on the other. You can't add another device to it. The 'P' in AGP is for "port". Still, in everyday use it "feels" quite a lot like PCI, so...
Also, I do think that "original" plain vanilla PCI operates at 32 bits and 33 MHz, for a total bandwidth of ~132 MB/s. There are versions (I don't know their exact names) that use 64 bits and/or 66 MHz, though.
Needless complexity bagged the VC buck$? Er, no. (Score:2)
1) theman.com received $20,000,000. Rather than suffering from needless complexity, it suffered from needless simplicity. (A website that advised Gen-X age men on good gift ideas for moms, or free pickup lines for cheap chicks?)
2) boo.com received $130,000,000. Their website suffered from needless complexity, but one could hardly say it was the fault of computer scientists (unless you consider flash animators and guys who sorta know javascript as computer scientists).
3) DEN received $60,000,000. They made 6-minute-long short films targeting demographics supposedly ignored by television programming (latino gangland culture, teenage christian dungeon & dragon players, drunken fraternity morons, etc.). Needless stupidity, to be sure. Anything but complex.
4) Eve.com wasted $29,000,000 of VC money to build an ecommerce site for cosmetics and other ephemera for females. (The pitch to the VC, Bill Gross of Idealab, took 90 minutes, and didn't involve any computer scientists)
5) iCast.com cast $90,000,000 at streaming video. They're dead, too.
The list goes on and on. That's over a quarter of a billion dollars above, thrown at companies founded not by computer scientists but by:
A poet & an ex-model, a couple of ex-hollywood honchos, previously unemployed MBAs and other non computer scientist types.
FWIWIA.
Re:Secure Path Login/LogOut (Score:2)
1) If you're using a public terminal or something similar, the people who provide it can probably just record the keystrokes at the keyboard level.
** Third-party hardware cannot be made secure by the addition of code to one component **
2) If this is your private system which has become compromised, secure login info is the least of your worries.
** Local machines are not made secure by the addition of code to this one area **
This is a really old, mostly useless standard left over from the rainbow book series. It looks good on paper, but won't get you far in the real world.
--
Problem between CS and other sciences (Score:2)
-Moondog
Convolusion isn't necessary. Try dialogs. (Score:2)
Re:CTRL-ALT-DEL (Score:2)
In the M$ world, c-a-d is the most powerful incantation, only to be used at times of great stress. Compare init(8). Admittedly, init is too great a God to involve himself in starting a user's session.
Re:ANGRY DENIAL! - WARNING OF GOAT SEX LINK (Score:2)
Computers are Aliens and Abstract Testing Devices (Score:2)
It's easy to criticize modern computers, as their user interface is not modern. Designing a legacy human interface was a calculated decision, however. People are accustomed to the windows (as in the object, not the MS software) interface, and when things change, people get scared. When people get scared, money stops flowing.
From a human interface standpoint, computers might as well be aliens from another planet. We taught them to speak to us with windows about 20 years ago (don't nitpick time with me :) and now that is the de facto standard. Computers that don't "speak that language" are considered toys in the public eye (see PS2, furbies, games on cell phones).
The essence of the speakers' complaints was that computer engineers have spent the last five decades designing computers around the newest technology--not for the people who use the machines.
I don't think it is appropriate for them to suggest computer interfaces have become obsolete because no one was paying attention, or because no one cared to advance the interface. On the contrary, there is a great deal of research on the subject; any computer science student has probably taken a human interface course or pieces thereof (I did).
I think another big problem is that it's posh to be one of the "tech elite" in the business world. Someone who can handle their computer is generally considered more skillful, and seems to have more potential than one who can't. Logically this is because they are able to learn new things, and have no difficulties with abstraction. That is important in business, and in life.
Anyone agree?
as it's criticize-the-grammer-nazi day (Score:2)
--
Hrm.... (Score:2)
You know, when i want my computer to shut down, i just type "shutdown."
maybe i want to reboot the computer....i type "reboot"
I don't think most Scientists are wrong for flaming the computer industry, but there is innovation out there....they're just looking in the wrong places
FluX
After 16 years, MTV has finally completed its deevolution into the shiny things network
Funding only stupid technologies? (Score:2)
---
Re:Martketroids, etc. (Score:2)
I guess the point is that while it may take the intelligence of a rocket scientist to run some systems, the rocket scientists would rather be working on rocket science, not computer science.
;-)
Martketroids, etc. (Score:2)
To a large degree, even though it is not named, well, for example there is this bit:
Targets of the critics' scorn included convoluted commands such as the common "Alt-Control-Delete" sequence used to close a program or perform an emergency shutdown. They also lambasted computer designers who refuse to distribute the machines' intelligence to smaller devices scattered throughout the home, instead insisting on packing a single box with maximum functionality.
Strangely, this sounds rather familiar. Certain large companies will not be named. They do not have to be. The marketroids have strangled the future.
Re:CTRL-ALT-DEL (Score:2)
You know, there's a reason it's such a 'convoluted' command: it keeps people from accidentally executing it!
I think that's hardly the point.
The point is that Ctrl-Alt-Delete is totally nonsensical from the general user's perspective. Why on earth should that mean "reset?"
My choice of solution would be a reset button that you have to hit two, or maybe three times, in close succession.
You wouldn't even need to document it; I guarantee you that, when a single push doesn't work, every single user will respond by hitting it repeatedly, and before long, they'll realize that you need to hit it more than once.
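The logic to back that up is tiny, too. A hypothetical sketch in Python (the two-press count and the one-second window are my invention, as is the stub that stands in for the real reset line):

    import time

    WINDOW = 1.0   # seconds within which the presses must land
    NEEDED = 2     # presses required before we actually reset
    presses = []

    def hard_reset():
        print("resetting!")  # stand-in for pulling the real reset line

    def on_reset_button():
        now = time.monotonic()
        presses.append(now)
        while presses and now - presses[0] > WINDOW:
            presses.pop(0)   # forget presses outside the window
        if len(presses) >= NEEDED:
            hard_reset()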
--
Re:Funding only stupid technologies? (Score:2)
"Everything that can be invented has been invented."
Re:not contributing anything useful? (Score:2)
...and DON'T FORGET TO CAPITALIZE YOUR I, MISTER!!!! ...and it's '16-year', not '16 year'.
Re:not contributing anything useful? (Score:2)
jibe: To be in accord; agree: Your figures jibe with mine.
jive:
1. Music. Jazz or swing music. The jargon of jazz musicians and enthusiasts.
2. Slang. Deceptive, nonsensical, or glib talk: "the sexist, locker-room jive of men boasting and bonding" (Trip Gabriel).
I'll let you decide which version our friend timothy meant.
From our friends at dictionary.com [goatse.cx].
Re:The real problem is... (Score:2)
Well put! Why do people go to garages to maintain their cars, but don't ever hire someone to maintain their computer?
Laugh at me, but I helped tons of people by administering their machines... for the little price of 4 six-packs per consultation. I would have done it for free, but most people want to give you something for your trouble.
Actually, where I live I have seen computer-troubleshooting companies for cases where individuals have problems with their computer. (Actually I think it is too late by then... a properly maintained computer has no problems.) I don't know what they are worth, but it seems to be working out quite well, since they are still in business. Not everyone knows a nerd who can help them, it seems.
Besides, I think that using a computer is like driving a car... you need minimal training to know what you are doing. (Don't flame me about people-safety: I know that badly using a computer can't hurt people, but badly using a car can...) Training complete newbies is very hard: I started to teach my mother, who never touched a computer in the past 50 years, to email with Eudora. Working with the mouse and keyboard all together causes her great problems. I can imagine she is not the only one out there.
Re:Looks like (Score:2)
Strictly speaking, that is also incorrect - said computer can achieve said task if it is allowed to run unhindered for an infinite time :)
--
Re:But think of all the innovations! (Score:2)
Funny... I always thought Porn was invented by lawyers, since they seem so focused on screwing everybody else, and recording it for posterity.
But hey, at least the Internet was invented by a non-computer scientist, Al Gore!
Star Trek and Voice Recognition (Score:2)
- I don't know about you, but I can usually type faster than I can talk.
- Imagine yourself speaking to your computer for 8 hours straight, 5 days a week. Heck, I doubt even professional speakers do that sort of thing.
- A room full of people typing and clicking away is slightly noisy. A room full of people talking to their computers would be quite stressful.
So, all in all, I'm OK with using keyboard and mouse to work on the computer. Now, what I'd really like to see in reality would be a functioning Holodeck. Playing VR-Quake would be sooo cool!
Re:Something doesn't add up! (Score:2)
Re:LAMEST. ARTICLE. EVER. (Score:2)
I did not spend another $100 upgrading the OS to support it. I also did not pirate a copy of OEM Win 95 or 98 to support it. I jerked it out and went to a PCI card instead and just blamed it on the OS and WIN hardware. I learned the hard way that the AGP port is a USB device and the original Win 95 does not support USB, even with all the service packs installed. By the way, Linux supports it ;-). I later upgraded the OS to Linux.
Re:Brain dead device drivers (Score:2)
Re:LAMEST. ARTICLE. EVER. (Score:2)
We also have a one-up in the design process. (Score:2)
Now, I'm not accusing anyone. I'm not saying all software developers are out to screw over the hardware people, but look...
Those who write the software are the last stage. Regardless of how well the engineers designed the hardware, the CS people can either make or break their designs with good or bad code respectively. CS people essentially have engineers at their whim.
So yes, I certainly agree they're jealous... but in more than one way. They're jealous because CS people, in a way, have more power over the flow of technology.
Re:Convolusion isn't necessary. Try dialogs. (Score:2)
Taguchi method???? (Score:2)
Re:LAMEST. ARTICLE. EVER. (Score:3)
AGP, PCI, USB, IEEE1394, ISA, EISA are all busses.
AGP is a design extension of the PCI bus which allows for convenient memory mapping (allowing host memory to be used for video mem, pooling and locking), different clocking, and different DMA strategies. Think of it as an extended PCI specification.
PCI was a complete redesign of EISA, with particular interest in bus speed and wider bus transfers. Best of all was autoconfiguration of IRQ, DMA, and port mapping. PCI operates at 66MHz.
USB = Universal Serial Bus. It is a chained 4-wire serial bus that has much more in common with ethernet than with AGP. It's basically a transmit/receive bus. IEEE1394 is very similar.
EISA and ISA are old standard busses which oftentimes required hardwired IRQ, DMA, and IO ports (because of their inability to autoselect empty slots and the lack of a decent bus controller). These were typically 8-, 16- and (EISA) 32-bit busses. And they were way slow, operating at 4 MHz or so.
So there you have it.
Pan
Re:Looks like (Score:3)
I prefer a standalone DVD player to a PC. I prefer to use a Palm for storing addresses. PCs, even notebooks, don't carry around very well. I'd prefer to carry a mini MP3 player around than to carry a PC around. I prefer a PlayStation for many games over a PC.
I'd prefer it if my microwave had its own embedded computer for timing, rather than having to hook a PC up to it in order to cook up my KD.
Judging by sales, I'd think the general public agrees with me, too.
Fact is, it's simpler to just hit a single button on a separate physical device than it is to hit a bunch of buttons on one. It seems that many programmers completely forget about ease-of-use on a physical level.
Of course, I'm just a grumpy old engineer, and an embedded one at that. I guess I'm the guy you're all rallying against right now...
Re:Looks like (Score:3)
Ya, it would be great if there was a standalone word processing device that I could go to, to do my papers. And then, one right next to it that would do spreadsheets. And next to that, one that would check my email. And one more, with a nice big monitor, to browse the web! Seems kind of wasteful, we should just make all these functions on one device? Wouldn't it also be really cool if said device could play my mp3's, or play games, or play dvd's? Oh wait.......
I don't buy this multiple-device idea. While it might be true that the devices you mentioned are doing well in sales, aren't they a little more specific in purpose than the tasks I mentioned? The PC has lasted this long due to its general applicability to a slew of applications.
Also, which one of your devices (aside from the playstation) would be worth the plastic it was made out of without a PC it could dock/communicate/exchange-data with?
Of course, I'm just a naive young software-developer, and I'd be out of a job if not for the PC.
--
Oh please... (Score:3)
I don't know why they would say such a stupid thing... I'll assume we all took what they said out of context/too seriously.
--
Re:CTRL-ALT-DEL (Score:3)
"Control-Alt-Delete, but wont that stop it?"
Mark Duell
Good design... (Score:3)
No timothy, when they say "design", I believe they are referring to things like usability testing. In other words, taking a software package to groups of users, and designing statistically sound experiments to see what users find easy and fast to use. In other words, users' ideas of good design - not yours, not mine.
If you're interested, maybe read some [asktog.com] sites [useit.com] on design. [acm.org]
Moreover, I think they are also saying that VCs should at least be aware of what theoreticians are thinking about, so they make better use of their investors' dollars.
Technology as an ends and not as a means (Score:3)
One more example (this time in the present): firewire. Apple, one of the few companies to move computer technology ahead (despite all of its numerous business/PR flaws), has started putting internal firewire buses in their computers. Why didn't any other computer/motherboard companies think of this? Don't they understand that firewire cables are far less of a hassle than ribbon cables, and block airflow far less? Don't they recognize the ease of use of being able to chain FW drives together? Don't they understand that external firewire is probably the easiest way for non-geeks to add new hardware (without the need to buy hubs)? But where is Intel? Where is Western Digital? Where is Seagate, or Asus, or Abit, Tyan, or any of the others? Nowhere, that's where. In fact, they barely put any stock in USB. Rumor has it that when apple announced that it was killing serial and replacing it with USB, an Intel executive called Steve Jobs to thank him for taking the bold move: "Getting all the others [OEMs] to go to USB was like herding cats".
To capitalize on the obvious pun: technology sucks because too many people are pussies.
Re:Computer scientists will rule the world (Score:3)
What are you talking about? Programming (for me, anyway) is ALL about the satisfaction from building something useful and the artistic delight of design - in programming, you build something from quite literally nothing - you create order from chaos. Programming is speech, but it's much more than that - to be a good programmer, you have to think in abstract ways and be able to think truly dynamically - static thinkers have no place in the art of programming. Anyone who says they are programming for *just* money is NOT an artist. Good code is truly poetry, and good programmers are truly artists.
Hypocritical.. (Score:3)
Funny... computers appear to be useful enough for giving PowerPoint presentations that quickly and easily present information to a large group. I find it a bit hypocritical that they'd bash computer design and ease of use while using PowerPoint instead of some other presentation medium.
Re:Looks like (Score:3)
that's why I'm changing my major (Score:3)
In explaining such issues to friends not familiar with the industry, I'll often draw parallels to similar situations. With this one, I'd say the computer craze is now at the point the car craze was in the late 1960's. Hobbyists are still common but on their way out. More and more people want the physical ideas of the technology eschewed for its practical purposes. Perhaps this economic turn is analogous to the oil crisis (and quite similar: I've heard that at least some of it is due to the California legislature and power companies scratching each other's backs to create the energy crisis out here. Personally, it wouldn't surprise me, since I feel absolutely no trust toward the motives of either group).
---
Re:LAMEST. ARTICLE. EVER. (Score:3)
Seriously...I just have visions of what would happen if my appliances started communicating with each other...
Fridge: Ok everyone, we know Alex has a final tomorrow at 8am.
All Kitchen Appliances: *evil laughter*
Fridge: Everybody turn on in 3...2...1...NOW!
*All appliances in the house turn on at once*
*Circuit breaker trips*
(At 11am the next morning)
Alex: NOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO!
You know it'll happen. One day. When you least expect it. They'll turn on you.
Looks like (Score:4)
Distributed systems are a nice thing in principle, but some problems can be broken up only so far. And then when it comes down to it, nobody wants to buy a specialized piece of a computer when they can get their generalized computer to work.
Look at the history of tools. First there were separate tools, each one doing a single or small number of jobs. Even a Swiss Army Knife was limited to about as many tasks as it had specialized attachments.
People like to pooh-pooh the computer as being "just another tool". But the computer is far, far different from any other tool that came before. The computer has the ability to be an INFINITE (or at least huge enough that you won't exhaust the possibilities in the lifetime of the universe) number of tools.
The engineers are being engineers. Who can blame them? They like single purpose tools. Heck, we like single purpose tools too, and that's why we generally embrace the UNIX philosophy of making a program do one thing, and do it well.
But the difference is that our specialization is in the software, and the specialization they are proposing is in the hardware. If I want a single purpose tool, I don't need a computer to get that.
ANGRY DENIAL! (Score:4)
Supporting claim. Second supporting claim.
Revelation of inconsistencies in the complaints.
Setup for attempt at witty attack on academics.
Punchline of witty attack.
---
Secure Path Login/LogOut (Score:4)
It's too bad Microsoft couldn't build applications the same way, safe from Trojan Horses.
--Mike--
Computer scientists will rule the world (Score:5)
Engineers are jealous of programmers. It's that simple. Programmers have an easy life, after all. I only work a few hours a day, get paid big bucks, and for what? For telling a machine what to do. For the act of mere speech. It's Rumpelstiltskin's tale incarnate: I spin words into gold.
Engineers have too many constraints; the standards are too high. When the Tacoma Narrows bridge fell down, heads rolled. But when a bug in the latest version of NT disables an aircraft carrier, Microsoft doesn't get blamed at all. Bugs are par for the course in our industry, and we have no intention of changing it. It means higher profits from fixes and lower expectations. How are engineers supposed to compete with those sorts of odds?
I admit I considered going into engineering when I started my college days, but I was quickly dissuaded. The courses were too involved, whereas the CS courses were a breeze for anyone who didn't fail calculus. And I don't regret it at all, really.
Programmers might not get the satisfaction of building something useful and might not experience the artistic delight of design, but we at least don't have to work as hard. And when it comes to the bottom line, that's all that counts.
Reason for stupid tech is IP law blocks code reuse (Score:5)
That's because when someone comes up with a useful technology, even something as simple as LZW compression or an MP3 encoder, NO ONE ELSE CAN USE IT in their product. Writing products that use someone else's file format is called a "copyright violation". Standardising on one crypto algorithm is called patent theft. CPUs with compatible MMX instructions get you sued by Intel. Making DVDs playable on "non-approved" systems gets you jailed, or court orders from people halfway around the world.
So yeah, "CS types develop complex and useless technologies." because we have to carefully avoid reinventing someone else's wheel or we get sued into bankruptcy.
One result is millions of different wheels of different diameters, shapes and track widths that are all incompatible with one another. Sounds pretty messy, right? It also happens to resemble what we see today in the computing industry.
The other result is people getting fed up with all the incompatibilities and looking for a standard, any standard. And since the standard is proprietary, naturally this will favor the growth of monopolies, e.g., Microsoft, who then uses their position as OS "standard" to create other standards, such as Excel and Word formats, whilst actively blocking anyone else from participating in that standard.
IMO both patent lifetime and copyright lifetime ought to be cut to 10 years tops for all things computing-related, hardware or software, because stuff in this field ages faster than any other traditionally patented and copyrighted work.
And there needs to be an irrevocable expiration for abandoned patents and copyrights too. It's absolutely insane that Atari 2600 games are still locked away by copyright, while no one is producing them. And they'll be locked away for over a century under current IP law. Is this right?
BAH! (Score:5)
Oh that's right, you ARE still writing in Fortran. My bad.
"Too easy" shutdown procedures (Score:5)
The RESET key, located at the top-left corner of the keyboard, triggered a software reset. This had the effect of (depending on the software you were using) terminating the program and dumping you back to a BASIC prompt, or erasing whatever unsaved data you had, or doing a hard reboot of the machine. Users quickly found out (the hard way) that this button was way too easy to press by accident. In fact, this problem was so pervasive that magazines such as Creative Computing began carrying ads for "RESET key protectors".
In later versions of the Apple II/II+ (and in subsequent machines such as the IIe), Apple required the CTRL key to be held down together with RESET before the machine would reset, which put an end to most accidental resets.
CTRL-ALT-DEL (Score:5)
Put it under F1, see if that makes them happy. You know, there's a reason it's such a 'convoluted' command: it keeps people from accidentally executing it!
Re:LAMEST. ARTICLE. EVER. (Score:5)
I thought my sarcasm was pretty good, thank you very much.
Perhaps my bile was uncalled for, but I'm sick of people implying "good design is easy, why doesn't someone just do it?"
Good design and usability are difficult. Do you think that the industry doesn't know that billions of dollars and instant fame and fortune are at stake here? Do you think that the industry doesn't try really, really hard to get that money?
There's a right way to criticize usability -- one author who does it right is Donald Norman (I'm sure there are others, but on this topic I've only read Mr. Norman's books). He manages to carefully consider what is wrong with a design, discusses the alternatives, and points out how usability could be improved.
There's also a wrong way. Say something like "Why can't my computer be as easy to use as a toilet?" God, I'm getting pissed off again. What's the feature list of that toilet? And what's the feature list of your computer; can you even hope to cover that list in any amount of detail? In fact, does your computer actually even have a standing feature list, or do you actively install new software (and thus new features) constantly? Dammit, everybody who uses a computer has complex needs--I have a web browser, an email client, a remote telnet session, an mp3 player, and a "find file" all open RIGHT now, and I suspect that I'm pretty tame compared to the average slashdot reader. I'm going to play an online game with friends in another state shortly. I could spend hours describing what I want EACH ONE of these to do. I happen to think that, all facts considered, the usability is pretty good. (And I might add: Damn, it's cool to live in the year 2001. This stuff rocks.)
Are things perfect? Of course not. One company has a monopoly on the desktop market and has very little incentive to innovate (in spite of their claims to the contrary) and every incentive to keep the status quo. Yes, the "START" button is retarded. Should we strive to improve the state of the art? Of course. Would it be awesome if it was easier to do my taxes? Sure, but are you absolutely sure you want the automated solution you described when it sacrifices transparency (are you sure your taxes were done correctly in that system?) and possibly privacy (who's keeping track of all that information flowing between your income-payers and the government?)? I actually think that TurboTax made my taxes about as easy as I'd like--it asked me a simple set of questions, I answered, and it was done. Any easier, and I'm not sure I'd completely trust it.
I actually don't know why you're arguing, since in at least one respect, you agreed with me. You said:
Simplicity of interface, sheer usability, takes a lot of talent, skill and creativity.
If you think about it, the article in question basically said these are all trivial, require little skill or talent, and it said so with a condescending attitude. It's actually really, really hard. Dismissing the problem is unwarranted and deserves an equally scathing reply.
LAMEST. ARTICLE. EVER. (Score:5)
Dammit, I hate these fuckers.
First of all, they contradict themselves. "Computers are too hard," they whine, but when a computer interface remains consistent and usable for twenty years, "If Rip Van Winkle went to sleep in 1982 and woke up today, he'd be able to drive our modern computers with no problem because they're essentially unchanged".
Second, they recommend creating "simpler" and "distributed" devices instead of monolithic boxes that do everything. What the hell does this mean, what devices really need more intelligence? All I can think of is one of those computerized thermostats. Whoopee.
Look. Computers are complex because your needs are complex. Worse yet, my complex needs are inconsistent with the needs of others. Try to download mp3s on your toaster. Try to do your taxes while downloading porn while instant messaging your friend in France while checking the weather on one of their great appliances. Try to use that "more intelligent than a computer" airport toilet to write up your Powerpoint slides, you pompous pricks.
Actually, in this case, that might have actually worked.
not contributing anything useful? (Score:5)
I just love it when Scientists fling mud and proclaim that the 'real-world' isn't science.
In mathematics, we have the very 'real' Taguchi quality control that revolutionized manufacturing processes, but according to my Math professors, "It's not real mathematics, just some linear algebra application."
On the topic of manufacturing: metal can now be formed and machined into virtually any shape, and ceramics and metals can be mixed and then burned to form aluminum tools (molds) for injection-molding parts. "That's just a trick of sintering the ceramic," my ceramics professor told me.
My point is that industry types, whether they are applying neural networks to read handwriting or creating thinner flat panel displays, solve the same complicated types of problems that the more 'scientific' community solves. The scientific community discredits their work because "theoretically it can be done, so why bother doing it." It's as though the companies that want to enhance their products by funding research shouldn't fund the research that is most likely to enhance their products!
I'm sorry to sound harsh, but this strikes close to home for me. I was on track for a PhD, but quit, and now I'm having a lot more fun developing optimized neural networks to do handwriting recognition.
Re:Funding only stupid technologies? (Score:5)