
The Biggest Roadblocks To Information Technology Development

ZDOne writes "ZDNet UK has put together a list of some of the biggest obstacles preventing information technology from achieving its true potential, in terms of development and progress. Microsoft's stranglehold on the desktop makes the list, as does the chip-makers' obsession with speed. 'There is more to computing than processor speed -- a point which can be easily proven by comparing a two-year-old PC running Linux with a new PC buckling under the weight of Vista. Shrinking the manufacturing process to enable greater speed has proven essential, but it's running out of magic ... What about smarter ways of tagging data? The semantic web initiative runs along these sorts of lines, so where is the hardware-based equivalent?'"
  • 0. Lack of (artificial) intelligence (still)

    More specifically, lack of ability of applications (or lack of applications able) to adapt to the needs of the individual user automagically (top of my wishlist: a memory crutch).

    /satire This will be fixed once evil&co realize that such a 'profiler' is a well-performing surveillance tool, while at the same time realizing that 'progress' driven purely by what is technologically feasible does not cut it.

    CC.
    • by crgrace ( 220738 )
      There will never be any "artificial intelligence" because once a computer achieves something that was considered "artificial intelligence", the phrase is redefined to exclude said achievement. So, by definition, artificial intelligence is impossible. In fact, once something that was considered to require intelligence is achieved, it is then relegated to rote "pattern matching".

      Examples:
      Adaptive filtering
      Chess
      Music composition
      Real-time target acquisition and tracking
        • Sounds like something I heard about human intelligence. People want to keep on thinking that we are different from the rest of the animals, so they come up with stuff that they think only humans can do, and therefore makes us different from the other animals. Many things have been struck down, such as use of tools, self awareness, emotions such as love and sympathy, high level communication, and many other ideas that have been shown to exist in animals.
        • high level communication
          In animals? Really? From what I recall of my linguistics and psychology, language is reflexive, arbitrary and symbolic. Which animals have this kind of communication?
  • Horrible (Score:5, Insightful)

    by moogied ( 1175879 ) on Tuesday November 27, 2007 @11:08AM (#21492175)
    The author clearly has no idea what they are saying.

    We haven't come far. Qwerty is 130 years old, and windows, icons, mice and pointers are 35. Both come from before the age of portable computing. So why are we reliant on these tired old methods for all our new form factors?

    We are reliant on them because they work damn good. It's not like they were the simplest of ideas; they were just the ones that stuck because they worked.

    • by SmallFurryCreature ( 593017 ) on Tuesday November 27, 2007 @11:19AM (#21492333) Journal

      Just because something is old does NOT mean it is obsolete. More and more I see this as an absolute truth, and my advancing (oh okay, runaway) age has nothing to do with it.

      Some things just work and don't really need to be replaced. Change for change's sake is bad. NOW GET OFF MY LAWN!

      • Exactly. We could have replaced paper books by now with a small e-reader device, but really, books work a lot better than computer screens for reading in a lot of ways. Even with everyone carrying around a laptop, you'll still see people reading paper books, because it's the best way to do it.
        • Personally I'd read a lot more ebooks if more ebooks were available without the restrictive DRM and hardcover prices as compared to paperbacks.

          I'm not going to spend $400 [amazon.com] and $20/book*.

          Though I'll admit to considering it as long as I can transfer my webscription [webscription.net] ebooks to it.

          *Yes, they advertise "New York Times® Best Sellers and all New Releases $9.99, unless marked otherwise." The whole 'unless marked otherwise' is real assuring. Besides, I don't normally read best sellers, and pay less than $10/book.
          • Exactly, it takes a lot of paper books to make the cost of the electronic book reader worth the price. Not only that, but the price of books is actually a reflection of how much people are willing to pay for them, not how much it actually costs to produce a copy of the book. I think the same would hold true for eBooks. eBooks wouldn't be any cheaper than paper books, and you still have to buy the reader. With paper books, you can resell them easily. You don't have to worry about backing them up. If you
      • Re: (Score:3, Insightful)

        That's a good point that too many people in the computer industry have yet to grasp, but there are some old, simple technologies that are really past their prime and survive on inertia alone. The example given above of a mouse and cursor is a pretty good one. I'm quite sure that, given a well designed user interface, I could be far more productive with a multi-touch screen as a pointing device than with a mouse. The problem is that that would completely change the ergonomics of computer workstations and use
        • Re: (Score:3, Insightful)

          by sumdumass ( 711423 )
          I don't think you thought that through enough.

          Using your arms and fingers to point at a screen is already a reality. I have had touch screens on monitors for a while, and you don't realize how much energy you end up exerting on something as simple as playing a game of solitaire. If you had to do your entire computing like this, you would be wanting the mouse back really fast. If your mouse is set up right, you shouldn't even have to pick your wrist up to move the pointer anywhere on the screen. it is loads mor
    • Re: (Score:3, Informative)

      by barzok ( 26681 )
      QWERTY is a holdover from the early days of mechanical typewriters, meant to slow typists down. It was not designed to be "good" for modern use.
      • Re:Horrible (Score:4, Informative)

        by ajs318 ( 655362 ) <sd_resp2@@@earthshod...co...uk> on Tuesday November 27, 2007 @01:47PM (#21494331)

        QWERTY is a holdover from the early days of mechanical typewriters, meant to slow typists down.
        That is a blatant lie. The QWERTY layout wasn't meant to slow typists down -- quite the opposite. It was meant to ensure you could type as fast as possible, by separating commonly-paired letters. In order to type a word, every type-bar had to move through the same place -- creating a potential for jamming up the machine if the next one arrives before the last one has left. The further apart any two type-bars were, the more likely the type-bar for the first letter would have fallen out of the way before the type-bar for the second letter moved into place.

        Unfortunately, they mucked up. The word lists used to arrange the keys were all in the present tense, and so "e" ended up next to "d".
    • Re:Horrible (Score:4, Interesting)

      by langelgjm ( 860756 ) on Tuesday November 27, 2007 @01:31PM (#21494127) Journal

      We are reliant on them because they work damn good. It's not like they were the simplest of ideas; they were just the ones that stuck because they worked.

      They may work "good", but don't forget that good is often the enemy of better. How much of the reason we stick to improving old technologies is because of familiarity, inertia in R&D, and lack of imagination? Probably more than we can imagine, which is itself part of the problem.

  • by suso ( 153703 ) * on Tuesday November 27, 2007 @11:08AM (#21492181) Journal
    I'll say it but it isn't going to do any good anyways.

    One of the big roadblocks is users not seeing the big picture or not caring. Over the years, I've seen so many programs (especially open source) get off track of their goals because of a large number of vocal users that don't get the point of the program and expect it to do something else.

    Or how about the biggest misconception of all time "Computers are going to make your life easier and they are going to be easy to use".
    • "Computers are going to make your life easier and they are going to be easy to use"

      You forgot the "Within 10 years, everything would have been programmed and CS will be an extinct profession".
      • Re: (Score:3, Interesting)

        by suso ( 153703 ) *
        You forgot the "Within 10 years, everything would have been programmed and CS will be an extinct profession".

        Wrong. If you've been paying attention, the computer industry re-invents itself whenever a new medium comes along and all the software gets written all over again.
        • 1970s - Hey computers, lets make a spreadsheet program.
        • 1980s - Hey personal computers, lets make a spreadsheet program for home use.
        • 1990s - Hey windows, lets make a spreadsheet program that crashes.
        • 2000s - Wow, the internet, let's make a s
        • Did you notice we were talking about big misconceptions (or wet dreams), such as working AI in our lifetime?
          CS has always been, and probably always will be, a self-sustaining industry; the tools and products evolve, but the work doesn't: we are continuously improving things or adding new ones on top of them.
          • by suso ( 153703 ) *
            CS has always been, and probably always will be, a self-sustaining industry; the tools and products evolve, but the work doesn't: we are continuously improving things or adding new ones on top of them.

            Yes, of course, but I was commenting on your statement that in ten years everything would have been written. Of course there are programmers that improve software during its lifetime, add new features, make it mature. But it seems like every time a generation of software becomes mature, a new medium comes along a
  • Here's One More (Score:5, Insightful)

    by puppetluva ( 46903 ) on Tuesday November 27, 2007 @11:09AM (#21492185)
    The insistence to present everything as a video instead of an article or good analytical summary is holding back technology information sharing (much like this video).

    I wish these outlets would stop trying to turn the internet into TV. We left TV because it was lousy.
    • It has to be one of the most annoying trends on the web. Like another comment said, I can read faster than some video can pass on the info. They force me to sit through an x-minute-long video when I could have read the same info in less than a minute. News websites are the worst for this. Someone had to write it down for the talking head to read in the first place, so how about publishing that? It would use less of their bandwidth too.
  • The success of the PC is that it is a quite universal tool. Changing its hardware to deal with some kind of data in a particular way is OK for some niches, but not mainstream. Who would want one PC to go on the web, one for word processing, one for email...
  • by beavis88 ( 25983 ) on Tuesday November 27, 2007 @11:12AM (#21492231)
    The number one problem is all the idiots who are too stubborn/stupid to learn how to use their tools. If these people knew as little about hammers as they do about computers, there wouldn't be a round thumb left in the whole goddamn world. Just because it's a computer doesn't mean you have to turn off your brain.
    • So true, I wish I had mod points.
    • Agreed (Score:4, Informative)

      by Ultra64 ( 318705 ) on Tuesday November 27, 2007 @11:36AM (#21492593)
      "It says click OK to continue... what should I do?"

      This is the kind of question I get to deal with at work.
      • Re:Agreed (Score:4, Interesting)

        by jvkjvk ( 102057 ) on Tuesday November 27, 2007 @02:23PM (#21494811)
        Moderated as funny, but...

        Think about this in other terms. When I push the "power wash" button on my dishwasher, I can reasonably expect to know what is going to happen. When I push the "OK" button on a random dialog I only know that I have caused some action to happen. For almost all of the times where I might have to push an "OK" button I know that what I think is going to happen coincides with what actually happens (oops, excepting any, you know, bugs).

        The GP says:

        The number one problem is all the idiots who are too stubborn/stupid to learn how to use their tools. If these people knew as little about hammers as they do about computers, there wouldn't be a round thumb left in the whole goddamn world. Just because it's a computer doesn't mean you have to turn off your brain.
        If hammers were as complicated as computers, I suspect that the accident rate in their use would be staggering and there would be no round thumbs left. That, and no one standing in front of or behind the hammer, since the heads would tend to fly off in use. In fact, people (with access to both) probably know more about how to use their computers than how to hammer a nail. In terms of knowledge, there is just so much more knowledge that is relevant and essential to using a computer than to using a hammer.

        The advice I would give to someone sitting at an "Ok to continue" prompt varies greatly depending on what I know about what they are doing. That is, not all "OK" buttons are created equal - one could show you pr0n of Natalie Portman while another could wipe your disk of erm... pr0n of Natalie Portman. They could even be the same program!

        Now, let's try this with a hammer analogy. So you go buy this hammer because you want to put a thermometer on the tree outside (weather bug, anyone?). While securing the thermometer to the tree, your house falls down into a pile of rubble. Your hammer caused it. Wha...?

        Yes, people have an obligation to use their brains when using technology, but a general purpose computer is still a complicated high tech instrument and the current generation of tools is not sufficiently advanced to resolve that complexity for the average person. If computers were as simple as hammers to use the issue would be resolved already.

        One can always blame the users for the shortcomings of computers or for the shortcomings of programmers or the UI experts. However, one is likely to have an easier time shaping the tools than the users of those tools. All well and good to call them idiots, stupid and stubborn, but they can damn well use a hammer (as well as their TV remote, car, cell phone, etc.) without issues.

        The question is how best to resolve that complexity so that it is more like a hammer from Craftsman rather than from Acme, as it appears now.
      • Re: (Score:3, Funny)

        by JohnBailey ( 1092697 )

        "It says click OK to continue... what should I do?" This is the kind of question I get to deal with at work.
        Hit the Any key obviously...
    • I think we need to change error messages to things that are technically accurate, with hyperlinks to wikipedia.

      Instead of Windows saying "This network has limited or no connectivity" and leaving the user to puzzle out exactly what the hell that means, it should just say "Unable to obtain an [[IP address]] from the [[DHCP]] server: operation timed out."

      Those of us who already know what that stuff means will know that they need to go futz with their router; those of us who don't might learn something (from, o
      • No way. Windows messages are already too arcane and often meaningless. The message should be very clear for a common (non-technical) user because most Windows users are not technical. Under each message should be an arrow that lets you expand to see the detailed technical explanation for those who need to know in order to resolve a problem.

        The average user would rather let their computer remain broken than learn what DHCP is (at least in my experience). A message stating "can't connect to the internet"
        • It's also useless to just about everyone, since it doesn't give any information about actually *fixing* it.
          • Maybe you didn't get to the third sentence. You should be able to see the details if needed.

            Average users don't fix anything. So why give them details that they don't understand by default? Explain things in simple terms so they get the basic idea of the problem. When they ask someone else how to fix it the other person can read the details.

            For example, the average user would understand, "No open WiFi nearby." That's enough to know basically what's going on. The techie might like to see "Authenticatio
            • Re: (Score:3, Insightful)

              by Hoi Polloi ( 522990 )
              I'm always working on old code so I constantly run into error handlers that say something like "File not found" but the info for the file name and where it is looking is available. Why not "File X not found at location Y"? (Assuming there is no security issue with giving this info of course) If the info is there pass it on and help the debugger.
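              A minimal sketch of that idea in Python: an error that carries both a plain-language summary (what the average user sees by default) and the technical detail (file name and location) for whoever ends up debugging it. The class, paths, and messages here are hypothetical, purely for illustration, not taken from any real program.

                  import os

                  class FriendlyError(Exception):
                      """Hypothetical error: plain summary for the user, detail for the troubleshooter."""
                      def __init__(self, summary, detail):
                          super().__init__(summary)
                          self.summary = summary   # what the average user sees
                          self.detail = detail     # what the "expand for details" arrow reveals

                  def load_config(path):
                      if not os.path.exists(path):
                          raise FriendlyError(
                              summary="Couldn't open the program's settings file.",
                              detail="File %r not found at location %r"
                                     % (os.path.basename(path), os.path.dirname(path) or "."),
                          )
                      with open(path) as f:
                          return f.read()

                  try:
                      load_config("/etc/example/app.conf")
                  except FriendlyError as e:
                      print(e.summary)             # default, non-technical message
                      print("Details:", e.detail)  # shown only when the user asks for them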
      • Re: (Score:3, Funny)

        by Bob-taro ( 996889 )

        I think we need to change error messages to things that are technically accurate, with hyperlinks to wikipedia.

        Instead of Windows saying "This network has limited or no connectivity" and leaving the user to puzzle out exactly what the hell that means, it should just say "Unable to obtain an [[IP address]] from the [[DHCP]] server: operation timed out."

        (user clicks link)

        "The page cannot be displayed."

        ... this is like the old joke about the network admin only reachable by email.

      • Re: (Score:3, Interesting)

        by azrider ( 918631 )

        I think we need to change error messages to things that are technically accurate, with hyperlinks to wikipedia.

        (emphasis added) Right. The error message may change from minute to minute depending on the perception of the last editor.

        Given a choice between a static yet cryptic message and one which will change without notice (and may not even be accurate), which would you choose?

        While I know that the /. crowd prefers (as the commercials say, "5 to 1") Wikipedia as a citation, this strikes me as "Lets make a

    • by Raul654 ( 453029 ) on Tuesday November 27, 2007 @12:15PM (#21493113) Homepage
      The hallmark of good design is that people don't have to know how it works under the hood. How many people who drive cars on a daily basis can describe the basics of what is going on in the engine? (And, I should point out - cars are much more mature technology than computers - simpler and generally better understood)

      That attitude, which is effectively equivalent to the RTFM attitude many people in the open source community take towards operating system interface design, is IMO the single biggest obstacle to widespread Linux adoption. Also (at the risk of starting an OS evangelism flamewar), it is the reason Ubuntu has become so very popular so recently. Ubuntu gets the design principles right, starting with a well-thought-out package manager (admittedly copied from Debian).
      • by rbanffy ( 584143 )
        Car analogies cannot, unlike cars themselves, go too far.

        Fixing a car requires specialized tools. It makes little sense to inform the owner that "fuel injector for cylinder 3 has limited flow" rather than "take your car to a dealership as soon as possible" and let the mechanic know that a fuel injector has a problem.

        On the other hand, all you need to fix a "Unable to obtain an IP address from a DHCP server" is right there in the computer (or, at most, a phone call away). You could even have a "Unable to gai
      • Re: (Score:3, Insightful)

        by nine-times ( 778537 )

        The hallmark of good design is that people don't have to know how it works under the hood. How many people who drive cars on a daily basis can describe the basics of what is going on in the engine?

        I'd generally agree with you, but an awful lot of people just don't want to learn how to use a computer. At all. It's like if people refused to learn the difference between the gas pedal and the brake, bought manual transmissions but left them in reverse all day, didn't stop at stop signs and drove on the wrong

    • Re: (Score:3, Insightful)

      by Nerdposeur ( 910128 )

      The number one problem is all the idiots who are too stubborn/stupid to learn how to use their tools.

      While this is true in some cases, I think it's mostly snobbery. Well-designed tools can be used intuitively.

      Most people learn exactly as much as they see a need to learn. How much do you know about how your car works? Your plumbing? Your washing machine? Just the basics, I'd guess - enough to use it. Thankfully, your car's manufacturer has kept things simple for you.

      The "idiots" you refer to may have adva

      • by GrumblyStuff ( 870046 ) on Tuesday November 27, 2007 @02:42PM (#21495093)
        You don't have to know how your car works but you still have to know how to drive the damn thing.

        The problem is that no one wants to learn how to do anything. Why? Because there's always someone they can bother with the same questions over and over again.

        aka THERE'S A GOOGLE SEARCH BAR RIGHT ON THE FIREFOX BROWSER. Stop going to Google then searching!
    • Re: (Score:3, Insightful)

      by CastrTroy ( 595695 )
      You are right. I've seen many people who are smart in most situations become inexplicably dumb when sitting in front of a computer. People seem to have this idea that the computer should just do everything for them, and so their brain shuts off. I'm not sure if that's the exact reason, but it does seem like that is what's happening. Also, I wouldn't expect to be able to walk up to a bunch of woodworking tools and a pile of wood and be able to build a set of furniture for my bedroom, with havi
    • If these people knew as little about hammers as they do about computers, there wouldn't be a round thumb left in the whole goddamn world.

      If hammers needed constant maintenance to function normally, people would stop using hammers.

    • I don't think you're being particularly fair. A hammer is a much simpler device than a computer. While there are certainly ways to refine your skills with a hammer and a bunch of neat little tricks with one you can pick up, the basic functioning of a hammer is very straight forward. Rudimentary but useful understanding of a hammer can be taught in about a minute.

      In that same minute, I could teach a person about how to use a mouse to manipulate a cursor, and how to double click. But a minute of instruction
  • by explosivejared ( 1186049 ) <hagan@jared.gmail@com> on Tuesday November 27, 2007 @11:13AM (#21492245)
    The simple fact that most people don't have a basic understanding of even the most simple IT tasks. Most people look at a computer and see it as just a box that hums and hisses and produces magical pictures. As long as most people have a largely uneducated view of IT it won't "live up to its potential", whatever that may be. Seriously, think about how much more productive an IT worker could be without having to do the constant virus cleanup and such things which can be, for the most part, easily avoided with just a basic understanding of security. Ignorance is the biggest obstacle
    • by BeBoxer ( 14448 ) on Tuesday November 27, 2007 @11:24AM (#21492395)
      Think how much more productive an IT worker could be if the software tools didn't require them to learn a bunch of skills which are irrelevant to their job. Back when cars had chokes and manually adjusted spark advance, you would have been claiming how important it was for drivers to get 'basic understanding' of these things. But of course the real answer was to completely hide these details from drivers so that today they have no idea what it even means to choke an engine or advance a spark. Yes, ignorance is a problem. But it's not the users who are ignorant. It's those of us who develop and maintain the IT systems who are ignorantly blaming the users for our own failings.
      • Not having to worry about something because it is taken care of automatically and not caring, even at a basic level, about what is being done automatically on your behalf are really two different things. To use the tired car example (why must engineers always use the car analogy?), it is good for even the average driver to understand basically how his engine is working even if the details remain unknown. If the driver has some basic level of understanding then he will be better able to judge for himself whe
      • by CastrTroy ( 595695 ) on Tuesday November 27, 2007 @11:43AM (#21492685)
        However, having a computer that doesn't bother its user and just takes care of itself goes against the main directive of computers. Computers are supposed to do whatever the user tells them to do. If the user instructs it to run a virus, it will run the virus. If the user instructs it to go to a phishing site and submit their banking credentials to the server, then the computer will do that. In many instances we've set up a lot of programs to ask the user when they try to do something stupid, but often they click yes, even if the computer advises against it. Maybe what we really need is AI, so that the computer will be able to tell the user "I can't let you do that, Dave", and then all our problems will be gone.
      • Mod parent up!

        My 50-something parents shouldn't have to learn about virus scans and disk defragmenting and registry maintenance in order to surf the web and send email. They have already spent their careers learning their own specialties.

        Why should our tools need babysitting all the time?

        • by rbanffy ( 584143 )
          "My 50-something parents shouldn't have to learn about virus scans and disk defragmenting and registry maintenance in order to surf the web and send email. They have already spent their careers learning their own specialties."

          That's why I set up my 70-something mother with a Macintosh.
  • by Anonymous Coward on Tuesday November 27, 2007 @11:14AM (#21492263)
    Management.
  • by lstellar ( 1047264 ) on Tuesday November 27, 2007 @11:15AM (#21492269) Homepage
    I personally believe Microsoft's dominance, and its recent antitrust troubles, have helped spur underground and indie programming. Nothing motivates youth like an evil world corporation, no? Granted, they operated a walled garden (or prison?) for many years, but you cannot tell me that a portion of the world's elite *nix programmers aren't motivated by the success of M$.

    And different forms of input? How do you release that article today, in the age of the Wii, the smart table, etc.? I think that, carpal tunnel aside, ye olde keyboard is simply the most efficient.

    Other than that (and some other sophomoric entries like "war") this article focuses on true hindrances, in my opinion. I believe lock-out, gaps in education and copyright laws infringe upon innovation the most. People will always have a desire to make something great, even if it is in the presence of a war, or Microsoft, etc. But people cannot innovate if it means punishment or imprisonment.
    • A lot of the Unix programmers out there are Unix programmers because the platforms that drive big business were developed and deployed on Big Iron & Unix before Microsoft was even founded. It has more to do with "Our original (INSERT ACCOUNTING/HR/ERP) package was developed for Sun/IBM/DEC back in the 1970's/1980's. Since then we've deployed newer versions on newer hardware, but it remains Unix Based." than with M$ being an evil corporation. Generally these folks are also well paid. Helps with motiva
  • Windows (Score:3, Insightful)

    by wardk ( 3037 ) on Tuesday November 27, 2007 @11:15AM (#21492279) Journal
    I suppose there are those people who will think this a troll.

    It's not, and it's the right answer.

    Windows is the single biggest stifler of progress in every IT shop I've been in. Yes, there are other challenges, but those are, for the most part, workable.

    You cannot work around this steaming pile of operating system. It rides on your ass all day, every day, like a yoke a slave might wear as he spends his 14-hour day rowing. Every now and then the whip comes down.

    Remove Windows from the IT shop and watch it THRIVE.

    • You're right, it sounds like a troll.

      I'm sure I speak for most IT professionals when I say when something comes along that's better for the particular job than Windows is, we'll switch eventually. This isn't religion, just practical and professional common-sense.

      Until that day, I don't think Windows is that bad to be honest. Having said that, I'd add that competition is healthy and so is diversity, but removing Windows won't achieve anything.
      • I'm sure I speak for most IT professionals when I say when something comes along that's better for the particular job than Windows is, we'll switch eventually. This isn't religion, just practical and professional common-sense.

        But don't forget that often when deciding that Windows is "the best tool for the job", the overriding factor is, "that's what all of our customers run". So the ubiquity of Windows can be a barrier to trying new things.

    • Troll or otherwise, you're just wrong. Windows is not the problem. It's the lazy and/or stupid bureaucrats you find in every IT department (top to bottom, much of the time) who admin it that are the problem. The tools they're given to work with would make no difference in their mindset and approach.
  • by Gizzmonic ( 412910 ) on Tuesday November 27, 2007 @11:17AM (#21492299) Homepage Journal
    All technological breakthroughs have happened already. The fax machine was the pinnacle of human achievement. Just give up.
  • Smarter not Faster (Score:4, Interesting)

    by downix ( 84795 ) on Tuesday November 27, 2007 @11:18AM (#21492315) Homepage
    I've said much the same as he did in regard to system speeds. If I optimize my system, I can outperform the latest and greatest my friends have, but I can only optimize so far due to the hardware design. I long for the old Amiga days, where the core of the system was integrated around the CPU while still giving the user a completely flexible design. Heck, you can find decades-old machines running very modern hardware, thanks to their innovative design. Ever tried to run a modern video card, sound card or NIC in a PC from 1989? I've seen Amigas do it. And they did it by being smarter, not faster.
  • by yagu ( 721525 ) * <yayagu@[ ]il.com ['gma' in gap]> on Tuesday November 27, 2007 @11:18AM (#21492321) Journal

    Perhaps the biggest roadblock is the general inability of the masses to grasp technology and at the same time technology's allure and ubiquity. Unlike other nuanced sciences (rocket science, brain surgery, etc), computer technology is trotted out as "easy enough for the masses".

    That "easy enough" has trickled down from the anointed few to the general population, both in the work place and in homes.

    Now, decisions and directions for technology are driven more by uninformed golf-course conversations than by true understanding of needs and the ability to match technology to solutions correctly. Heck, I watched one technology abandoned entirely at management's whim so they could implement a newer and better solution. This, while the existing solution worked fine and the new solution was unproven. (Coda to that story: five years later, that team is busily converting the "new" back to the "old".)

    Time and again I see people doing bizarre things with technology... in the workplace, with hubris, unwilling to ask others what is most appropriate, and in the home, where ignorance, while benign in intent, rules. I don't know how many times I've encountered things like people with multiple virus checkers running on their machine because they figure more is better.

    At the same time, I remember a salesman trying to steer me away from a PC that wasn't their "hot" item because it had a video card with FOUR megabytes memory (this was a LONG time ago)... his reasoning? Who in their right mind would ever USE four megabytes memory for video??? Yeah, this salesman was senior. Yeah, I got it, he was an idiot. But these are the drivers of technology.... people not in the know.

    And, while I only have limited direct anecdotal experience of this in well-known companies, I would expect it to be more widespread than many might realize.

    • Re: (Score:2, Insightful)

      by foobsr ( 693224 )
      Perhaps the biggest roadblock is the general inability of the masses to grasp technology

      Eventually more like: "Perhaps the biggest roadblock is the general inability of humanity to navigate a complex system beyond an arbitrarily negotiated collection of mostly unrelated local optima".

      For short one may name it "collective stupidity".

      CC.
    • by Otter ( 3800 )
      Unlike other nuanced sciences (rocket science, brain surgery, etc), computer technology is trotted out as "easy enough for the masses".

      On the other hand, rockets and neurosurgery gear provide employment for a tiny number of really smart people, while IT creates jobs for any halfwit who knows how to find the ';' key. For all the sneering about "the masses", I don't think you guys would be happy if they really did stop using computers.

      I don't know how many times I've encountered things like people with mult

  • Bullshit (Score:4, Informative)

    by everphilski ( 877346 ) on Tuesday November 27, 2007 @11:20AM (#21492345) Journal
    There is more to computing than processor speed

    As someone who does scientific computing, I say bunk! My primary bottleneck is still the processor. FTA:

    Too much R&D time and money goes into processor speed when other issues remain under-addressed. For example, could data not be handled a bit better? What about smarter ways of tagging data? The semantic web initiative runs along these sorts of lines, so where is the hardware-based equivalent?

    Sure, tagging and controlling data is important, but far from difficult, and with well-written programs a good suite of visualization tools is relatively easy. Give me some speed, dammit! Why should I have to wait for my slot on the cluster when I could have the power right here under my desk?

     
    • Re: (Score:3, Insightful)

      by Chris Burke ( 6130 )
      Sure, tagging and controlling data is important, but far from difficult, and with well-written programs a good suite of visualization tools is relatively easy. Give me some speed, dammit! Why should I have to wait for my slot on the cluster when I could have the power right here under my desk?

      Not to mention that unless he's talking about more efficient data paths (i.e. more IPC instead of clock frequency, but still more overall execution speed), that kind of 'data tagging' is completely inappropriate for a
  • by jellomizer ( 103300 ) * on Tuesday November 27, 2007 @11:23AM (#21492377)
    Perhaps it's because I am a Mac user and am kind of used to the "best of both worlds" (or worst of both worlds, depending on your priorities) of Windows and Linux, but using all three OSs I have seen significant progress in the past 8 years. There hasn't been much new innovation per se, like killer apps that change the world and how we think and do things, but society has greatly changed and technology has improved...

    Windows. Love it or loathe it, Windows has greatly improved over the past 8 years. XP alone got the population off of DOS-based OSs (DOS, Windows 3 through Windows ME) and onto the more stable NT kernel. As a result, major PC problems have been reduced even as the dangers keep increasing. Take a 98 box and do some web browsing and see how long before it becomes unusable. No, it is not perfect by any means, there is a lot of suckage to it, and Vista doesn't seem much better, but there has been a huge stabilization of Windows; even Vista is more solid than 98 or ME.

    Linux. It is no longer considered a fad OS. People now take it seriously, not just as a baby Unix clone, and it is used widely in the server environment. Desktop Linux never really hit full force, mostly because of the rebirth of Apple, but there have been a lot of huge improvements in the user interface and it is comparable to current versions of Windows.

    Internet use. During the 90s people used the internet mostly as a fad, but now it is part of their lives. Just imagine doing things 10 years ago. Most things you needed to go to the store to buy. For information you needed to trek to the library, and doing papers required a huge amount of time dedicated to finding sources. There were a lot of things we wanted to know but didn't, because there wasn't any speedy way of looking them up. Finding people, getting directions: things are much different now than they used to be.

    While there hasn't been great innovation, there has been great stabilization and a culture change around technology, which helps to spur on the next wave of innovation in the future. We as a culture need time to let massive changes sink in so we can fully understand what the problems are with technology that need to be fixed.

    • XP alone got the population off of DOS-based OSs (DOS, Windows 3 through Windows ME) and onto the more stable NT kernel.

      Actually, Windows 2000 accomplished that; XP descended from it. But thanks for playing...

      Desktop Linux never really hit full force, mostly because of the rebirth of Apple, but there have been a lot of huge improvements in the user interface and it is comparable to current versions of Windows.

      I don't really think Apple deserves credit for that one. It's more likely that Linux just isn't what people are used to. The number of new computer owners is getting pretty small in comparison to the number of people who are buying new computers to replace PCs that they owned before. So naturally they are inclined to buy something familiar instead of something different.

      And of course the near-imposs

  • by SmallFurryCreature ( 593017 ) on Tuesday November 27, 2007 @11:28AM (#21492463) Journal

    Right, look at their page, filled with words that have NOTHING to do with the actual contents but that still get noticed by search engines.

    All the big sites work like that, designed to show up no matter what you search for. Games sites are especially bad/good at this; no matter what game you look for, IGN will show up as the definitive source for info on it.

    If you want the semantic web, dear ZDNet, stop this crap NOW. Start it yourself and clean up your site so that your pages are only indexed for the actual article, not all the crap around it.

    Oh but you don't wanna do that do you, because that ain't economical and will put you at a disadvantage.

    Well, that is the same reason behind all your other points. Don't ask Intel to give up the speed race if you are unwilling to give up the keyword race.

    Semantic web? Wikipedia is my new search engine. Because wikipedia is one of the only sites to only want to return accurate results and not spam keywords like mad.

    The semantic web can't happen until you get rid of people who spam keywords. You can't make smarter PCs as long as reviewers and customers obsess about clock speeds.

    The first to change might win, but they will be taking a huge risk, and none of the established players will do that. Remember, it took an upstart like Google to change the search market; now that it is big, do you really think Google would dare blacklist IGN from its results because it has too many empty pages? Of course not. Maybe the next search company will try that, but not Google.

    Change your own site first ZDNet, then talk about how the rest of the industry should change.
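    For what it's worth, the "smarter tagging" the summary asks about mostly means describing content by what it is, rather than by how many keywords surround it. A minimal sketch in Python of what that can look like, emitting a schema.org-style JSON-LD description; the values below are invented for illustration and are not taken from ZDNet's actual markup.

        import json

        # Describe an article by meaning, not by a pile of search keywords.
        # "Article", "headline", "author", "about" are schema.org vocabulary;
        # the values here are hypothetical.
        article_metadata = {
            "@context": "https://schema.org",
            "@type": "Article",
            "headline": "The Biggest Roadblocks To Information Technology Development",
            "author": {"@type": "Organization", "name": "ZDNet UK"},
            "about": ["information technology", "hardware", "semantic web"],
            "datePublished": "2007-11-27",
        }

        # A crawler that understands the vocabulary can index the page by meaning
        # instead of by whatever keywords happen to be sprinkled around it.
        print(json.dumps(article_metadata, indent=2))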

    • In other news, 95% of drivers agree with the proposition that the guy in front of them should have taken the bus.
    • in that we never say what we mean.

      Try transliterating most expressions, especially curses, across linguistic barriers and you immediately see the problem.

      How is a computer supposed to 'understand' you when you can't even understand yourself without years of intimately shared experience?

      Google, with its extremely sophisticated pattern matching, is part of the solution, but they can only do so much.

      Yahoo, with its human moderated search spaces, is also part of the solution, but they can only do so much.

      Deep c
    • by Intron ( 870560 )
      Which is why marketing types are now out there editing Wikipedia pages to point to their company.
  • The biggest limiting factor that prevents us from experiencing computing nirvana (a la Star Trek: "computer, do this...") is the artificial limits placed on us by corporations trying to gouge us for more profit.

    Cell phone companies: Imagine how much more pervasive internet access would be if data access didn't cost more than a mortgage payment. I can accept a certain degree of slowness based on technical limitations.

    ISP's: Offer the moon, and then restrict your access if you try to leave the driveway. "U
  • is the skills of the people practicing IT. The root of the problem is the skills of the people hiring the people who practice IT, who prefer to hire a larger number of cheap people rather than fewer good ones.
  • by LWATCDR ( 28044 ) on Tuesday November 27, 2007 @11:37AM (#21492605) Homepage Journal
    The x86, MS-DOS/Windows, and Unix/POSIX.

    Yes, the x86 is fast and cheap, but we have it only because it ran MS-DOS and then Windows. I have to wonder just how good an ARM core made with the latest process would be. How cheap would it be at a tiny fraction of the die size of an x86? How little power would it take?
    How many of them could you put on a die the size of the latest Intel or AMD CPU? Maybe 16 or 32?
    It would not run Windows though...
    Take a look at the T2 from Sun.
    And then we get to Unix. Yes, I use Linux every day. I love it and I want to keep it. The problem is that I think we could do better. Linux and the other Unix and Unix-like OSes are eating up a huge amount of development resources.
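    A rough back-of-envelope for the core-count question; both figures below are ballpark assumptions for illustration, not measured numbers from the article.

        # Back-of-envelope only; both inputs are rough assumptions.
        # A 2007-era desktop x86 die was on the order of ~150 mm^2, and a simple
        # ARM core (without large caches) on a comparable process was on the
        # order of a few mm^2.
        x86_die_mm2 = 150.0     # assumed ballpark
        arm_core_mm2 = 3.0      # assumed ballpark, core logic only

        naive_cores = x86_die_mm2 / arm_core_mm2
        print("Naive upper bound: ~%d ARM cores" % naive_cores)
        # Caches, interconnect, memory controllers and I/O eat most of that area
        # budget, which is why a figure like 16-32 cores (or Sun's 8-core T2)
        # is the more realistic shape of such a chip.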
  • A BlackBerry keyboard is a wonder of miniaturisation; shame the same's not true of most BlackBerry users.

    - the author is on drugs. BTW, I don't like BBs, but many people can't live without them; the small keyboards are their cocaine, and I am pretty sure those are not Smurfs we are talking about.

    The current lack of global wars and/or disasters

    - there are plenty of wars going on at any point in time. Let's bomb the author of this POS article, maybe that will help to improve the tech.

    The author is an ass.

  • Idiot clients... (Score:2, Insightful)

    by Dracos ( 107777 )

    That are too obsessed with what they want, and ignore the developers who know what they need and how to mesh want and need together.

    The site I launched last week (prematurely, at the client's insistence) had no content, but it did have the oh-so-necessary splash page with a 5 meg flash video (with sound!) embedded in it that to the casual observer looks like a trailer for a new Batman movie. All the issues I'd brought up since the project began suddenly became important after the site went live (except th

  • ...the speed at which humans work and the Graphical User Interface.

    We are the main limiting factor in any system. Computers are theoretically designed to meet human expectations of response times. However, how much overall variation in response have we noticed between a 386 running MS-DOS and Windows 3.1 fifteen years ago and a 2 GHz Pentium running XP today? Maybe compilers run faster, but everyday tasks like word processing or e-mail seem to run at about the same speed from a user perspective
  • ... as does the chip-makers' obsession with speed. 'There is more to computing than processor speed -- a point which can be easily proven by comparing a two-year-old PC running Linux with a new PC buckling under the weight of Vista. Shrinking the manufacturing process to enable greater speed has proven essential, but it's running out of magic ...

    Yes, there is more to a processor than raw clock speed. But the article misses a great discussion here and suggests "a better way of tagging data." WTF?

    AMD and Intel have already realized that faster clock speeds no longer equate to free performance. The newest processors have cache sizes that were unthought of four years ago. Whether consumers realize it or not, multicore superscalar desktop processors have become, and will remain, the norm. These processors have the ability to take advantage of parallelism
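    That last point is the catch: the extra cores only pay off when software actually spreads its work across them. A minimal sketch in Python of fanning a CPU-bound job out over the available cores; the workload is an arbitrary stand-in, purely for illustration.

        from multiprocessing import Pool
        import os

        def cpu_bound_chunk(n):
            # Stand-in for real work: a tight arithmetic loop that keeps one core busy.
            total = 0
            for i in range(n):
                total += i * i
            return total

        if __name__ == "__main__":
            chunks = [2_000_000] * 8
            # Serial: one core does everything, no matter how many are on the die.
            serial = [cpu_bound_chunk(n) for n in chunks]
            # Parallel: the same work fanned out across the available cores.
            with Pool(processes=os.cpu_count()) as pool:
                parallel = pool.map(cpu_bound_chunk, chunks)
            assert serial == parallel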

  • This is about the stupidest article I've read in a while.

    What "holds back tech" is the lack of talent. Plain and simple. If you want to beat Microsoft, you have to out innovate them. Yes, they have a stranglehold on the desktop, but why? Because they have an open OS that is easy to program for, and has low development costs along with quick development. .NET while far from perfect, is a pretty good building ground. Want to know why Mac gaming, or Linux gaming never took off? Ask John Carmack, who has espous
  • by ErichTheRed ( 39327 ) on Tuesday November 27, 2007 @11:51AM (#21492781)
    I know I'm going to get it for this, but here goes. One of the biggest holdbacks on technology progress is the constant churning of the tech landscape every few months. Before you think I'm crazy, hear me out. How many people work in workplaces that use Technology X where the CIO reads an airline magazine article about Technology Y? The next day, you're ripping out system X, which was actually getting stable and mature, and implementing Y just because it's new. When Y starts causing all sorts of problems, Technology Z will come along and solve everything. Software and hardware vendors love this because it keeps them in business. Most mature IT people can't stand it because they're constantly reinventing the wheel.

    There's a reason why core systems at large businesses are never changed...they work, and have had years to stabilize. Along the way, new features are added on top.

    I know the thrust of the article was "what's holding up progress in general?" Part of running a good IT organization is balancing the new and shiny with the mature and tested. Bringing in new stuff alongside the mature stuff is definitely the way to go. See what works for you, and keep stuff that works and isn't a huge pain to support.

    One other note -- a lot of technology innovation isn't really innovation. It's just repackaging old ideas. SOA and Web 2.0 are the new mainframe/centralized computing environment. Utility computing is just beefed-up timesharing distributed out on a massive scale. This is another thing that holds up progress. Vendors reinvent the same tech over and over to build "new" products.
  • So there MUST be 10, no more, no less.

    Problems with IT development:

    1. Proprietary formats: How much effort is lost in "Resend that as a *** file?" or "How do I open that file?" We have some decent standards like PostScript, LaTeX, HTML, and OOXML. But everybody is intent on using that newfangled version of MSOffice, in which each version is intentionally incompatible with the previous.

    2. Proprietary network protocols: Here we're talking about MS again, this time in terms of SMB file sharing and Kerberos munging.
  • The biggest roadblock is that there are not enough people doing pure computer science research; everything else is secondary.
  • by smcdow ( 114828 ) on Tuesday November 27, 2007 @12:00PM (#21492911) Homepage
    I'd rather have a machine with a slower CPU but with wide, fast buses and smart, fast I/O subsystems than a machine with a faster CPU but crappy I/O. Maybe I'm just weird that way.
  • IMO, the biggest obstacle is the digital divide. The prevailing and overwhelming majority of people in the world are economically and socially dispossessed, which one can only imagine deprives the rest of us of people who would otherwise have contributed richly to IT development.
  • by erroneus ( 253617 ) on Tuesday November 27, 2007 @12:18PM (#21493153) Homepage
    This is a pretty well accepted notion, and there are numerous examples not just of monopolistic power coinciding with technological stagnation, but of monopolies being busted and things changing shortly thereafter. The most common example of this is when the phone service monopolies were broken up.

    But in most (probably all) states in the US, there is a utility commission that sets the minimum standards for service offerings. Why is this? Clearly, because there is a need to mandate a minimum required level of service to companies. When the utility commissions don't mandate levels of service high enough, we end up with... well, what we see all too often: technological "ghettos" where service providers don't want to invest because the areas yield low returns. They would rather, if it were up to them, cherry-pick only the areas that yield premium returns, as that makes business sense. But even today, there are too many places where DSL isn't available or, more commonly, where fiber service is unavailable.

    And all too often we hear about "net neutrality" because the telecoms are complaining that various applications are flooding the internet and threatening to crash it. The answer that they don't want to hear, of course, is that they should be required to scale up their hardware to handle heavier loads. They would rather restrict or impede certain types of service to reduce the bandwidth demand. (Think Comcast)

    But beyond communications, when Microsoft or any other company lacks competition, they lose incentive to apply funding to R&D, which directly affects new technologies being developed and released. Microsoft probably doesn't do much R&D. Instead, their strategy seems bent on "buying new things." This makes their R&D budget low and relies on a practice that maintains their monopoly while being parasitic against the rest of the industry. (That is to say when someone comes up with and develops a really good idea, Microsoft is likely to simply buy it... and either suppress it or put their name on it.)

    This is a rather "natural" behavior even if it is unhealthy for economies and societies hungry for growth and improvement. Note my assertion that "natural" doesn't mean healthy or good.
  • In a rut. (Score:4, Insightful)

    by ZonkerWilliam ( 953437 ) * on Tuesday November 27, 2007 @12:23PM (#21493223) Journal
    IMHO, I think IT is in a rut, just as the article alludes to. What is needed is to rethink the process. Look at providing important information to people where they are. In other words, it shouldn't matter where I am: if I sit down in front of a computer, I should be able to get to my information and applications. Information, and not the computer, should become ubiquitous. An RFID card system (with encryption) should allow a person to sit in an office, or a cube, and have their phone calls and desktop forwarded to the workstation they're in front of.
  • Software Patents (Score:5, Insightful)

    by CustomDesigned ( 250089 ) <stuart@gathman.org> on Tuesday November 27, 2007 @12:40PM (#21493455) Homepage Journal
    ... are the biggest roadblock to IT development. No entity, not even non-commercial open source, is safe from being sued into oblivion for the crime of not only having an idea, but implementing it. The risk is still low enough that most of us are still taking it. But it is building like an epidemic. The only defense is a policy of Mutually Assured Destruction backed by a massive portfolio of your own asinine software patents.
  • that don't fund stuff, or even try to push out stuff that they have no clue about but read about somewhere, and want IT to use it without asking whether it will be a good fit.
  • by stewbacca ( 1033764 ) on Tuesday November 27, 2007 @01:19PM (#21493987)

    Even the best technical process could benefit from a little humanity.
    I translate this not as needing more women, but rather as needing FEWER nerds.
