
The Biggest Roadblocks To Information Technology Development 280

ZDOne writes "ZDNet UK has put together a list of some of the biggest obstacles preventing information technology from achieving its true potential, in terms of development and progress. Microsoft's stranglehold on the desktop makes the list, as does the chip-makers' obsession with speed. 'There is more to computing than processor speed -- a point which can be easily proven by comparing a two-year-old PC running Linux with a new PC buckling under the weight of Vista. Shrinking the manufacturing process to enable greater speed has proven essential, but it's running out of magic ... What about smarter ways of tagging data? The semantic web initiative runs along these sorts of lines, so where is the hardware-based equivalent?'"
This discussion has been archived. No new comments can be posted.


  • Biggest roadblock? (Score:1, Insightful)

    by Anonymous Coward on Tuesday November 27, 2007 @11:06AM (#21492155)
    IT workers and their know-it-all attitudes.
  • Horrible (Score:5, Insightful)

    by moogied ( 1175879 ) on Tuesday November 27, 2007 @11:08AM (#21492175)
    The author clearly has no idea what they are talking about.

    We haven't come far. Qwerty is 130 years old, and windows, icons, mice and pointers are 35. Both come from before the age of portable computing. So why are we reliant on these tired old methods for all our new form factors?

    We are reliant on them because they work damn well. It's not like they were the simplest of ideas; they were just the ones that stuck because they worked.

  • by suso ( 153703 ) * on Tuesday November 27, 2007 @11:08AM (#21492181) Journal
    I'll say it, but it isn't going to do any good anyway.

    One of the big roadblocks is users not seeing the big picture or not caring. Over the years, I've seen so many programs (especially open source) get off track of their goals because of a large number of vocal users that don't get the point of the program and expect it to do something else.

    Or how about the biggest misconception of all time "Computers are going to make your life easier and they are going to be easy to use".
  • Here's One More (Score:5, Insightful)

    by puppetluva ( 46903 ) on Tuesday November 27, 2007 @11:09AM (#21492185)
    The insistence to present everything as a video instead of an article or good analytical summary is holding back technology information sharing (much like this video).

    I wish these outlets would stop trying to turn the internet into TV. We left TV because it was lousy.
  • by beavis88 ( 25983 ) on Tuesday November 27, 2007 @11:12AM (#21492231)
    The number one problem is all the idiots who are too stubborn/stupid to learn how to use their tools. If these people knew as little about hammers as they do about computers, there wouldn't be a round thumb left in the whole goddamn world. Just because it's a computer doesn't mean you have to turn off your brain.
  • by Anonymous Coward on Tuesday November 27, 2007 @11:14AM (#21492263)
    Management.
  • by lstellar ( 1047264 ) on Tuesday November 27, 2007 @11:15AM (#21492269) Homepage
    I personally believe Microsoft's dominance, and its recent antitrust troubles, have helped spur underground and indie programming. Nothing motivates youth like an evil world corporation, no? Granted, they operated a walled garden (or prison?) for many years, but you cannot tell me that a portion of the world's elite *nix programmers aren't motivated by the success of M$.

    And different forms of input? How do you release that article today, in the age of the Wii, the smart table, etc.? I think that, carpal tunnel aside, ye olde keyboard is simply the most efficient.

    Other than that (and some other sophomoric entries like "war"), this article focuses on true hindrances, in my opinion. I believe lock-in, gaps in education, and copyright laws infringe upon innovation the most. People will always have a desire to make something great, even in the presence of a war, or Microsoft, etc. But people cannot innovate if it means punishment or imprisonment.
  • Windows (Score:3, Insightful)

    by wardk ( 3037 ) on Tuesday November 27, 2007 @11:15AM (#21492279) Journal
    I suppose there are those people who will think this is a troll.

    It's not, and it's the right answer.

    Windows is the single biggest stifler of progress in every IT shop I've been in. Yes, there are other challenges, but those are, for the most part, workable.

    You cannot work around this steaming pile of an operating system. It rides on your ass all day, every day, like a yoke a slave might wear through a 14-hour day of rowing. Every now and then the whip comes down.

    Remove Windows from the IT shop and watch it THRIVE.

  • by yagu ( 721525 ) * <{yayagu} {at} {gmail.com}> on Tuesday November 27, 2007 @11:18AM (#21492321) Journal

    Perhaps the biggest roadblock is the combination of technology's allure and ubiquity with the general inability of the masses to grasp it. Unlike other nuanced sciences (rocket science, brain surgery, etc.), computer technology is trotted out as "easy enough for the masses".

    That "easy enough" has trickled down from the anointed few to the general population, both in the work place and in homes.

    Now, decisions and directions for technology are driven more by uninformed golf-course conversations than by true understanding of needs and the ability to match technology to solutions correctly. Heck, I experienced the wholesale abandonment of one technology at management's whim to implement a newer and better solution, while the existing solution worked fine and the new solution was unproven. (Coda to that story: five years later, that team is busily converting the "new" back to the "old".)

    Time and again I see people doing bizarre things with technology... in the workplace, with hubris, unwilling to ask others what is most appropriate, and in the home, where ignorance, while benign in intent, rules. I don't know how many times I've encountered things like people with multiple virus checkers running on their machine because they figure more is better.

    At the same time, I remember a salesman trying to steer me away from a PC that wasn't their "hot" item because it had a video card with FOUR megabytes memory (this was a LONG time ago)... his reasoning? Who in their right mind would ever USE four megabytes memory for video??? Yeah, this salesman was senior. Yeah, I got it, he was an idiot. But these are the drivers of technology.... people not in the know.

    And, while I only have limited direct anecdotal experience of this in well-known companies, I would expect it to be more widespread than many might realize.

  • by SmallFurryCreature ( 593017 ) on Tuesday November 27, 2007 @11:19AM (#21492333) Journal

    Just because something is old does NOT mean it is obsolete. More and more I see this as an absolute truth, and my advancing (oh okay, runaway) age has nothing to do with it.

    Some things just work and don't really need to be replaced. Change for change's sake is bad. NOW GET OFF MY LAWN!

  • by jellomizer ( 103300 ) * on Tuesday November 27, 2007 @11:23AM (#21492377)
    Perhaps it's because I am a Mac user and I am kind of used to the best of both worlds (or the worst of both, depending on your priorities) of Windows and Linux. But using all three OSs, I have seen significant progress in the past 8 years. There hasn't been much new innovation per se, no killer apps that change the world and how we think and do things, but society has greatly changed and technology has improved...

    Windows. Love it or loathe it, Windows has greatly improved over the past 8 years. XP alone got the population off the DOS-based OSs (DOS, Windows 3 through Windows ME) and onto the more stable NT kernel. As a result, major PC problems have been reduced even as the dangers have increased: take a 98 box, do some web browsing, and see how long it takes to become unusable. No, it is not perfect by any means, there is a lot of suckage to it, and Vista doesn't seem much better, but Windows has stabilized hugely; even Vista is more solid than 98 or ME.

    Linux. It is no longer considered a fad OS. People now take it seriously, not as just a baby Unix clone, and it is widely used in the server environment. Desktop Linux never really hit full force, mostly because of the rebirth of Apple, but there have been a lot of huge improvements in the user interface, and it is comparable to current versions of Windows.

    Internet use. During the 90s people used the internet mostly as a fad, but now it is part of their lives. Just imagine doing things 10 years ago: most things you needed to go to the store to buy; for information you had to trek to the library; writing papers required huge amounts of time dedicated to finding sources. There were a lot of things we wanted to know but didn't, because there was no speedy way to look them up. Finding people, getting directions... things are much different now than they used to be.

    While there hasn't been great innovation, there has been great stabilization and a culture change around technology, which helps spur the next wave of innovation. We as a culture need time to let massive changes sink in so we can fully understand which problems with technology need to be fixed.

  • by SmallFurryCreature ( 593017 ) on Tuesday November 27, 2007 @11:28AM (#21492463) Journal

    Right, look at their page, filled with words that have NOTHING to do with the actual contents but that still get noticed by search engines.

    All the big sites work like that, designed to show up no matter what you search for. Games sites are especially bad/good at this; no matter what game you look for, IGN will show up as the definitive source for info on it.

    If you want the semantic web, dear ZDNet, stop this crap NOW. Start it yourself and clean up your site so that your pages are only indexed for the actual article, not all the crap around it.

    Oh, but you don't want to do that, do you, because it isn't economical and will put you at a disadvantage.

    Well, that is the same reason behind all your other points. Don't ask Intel to give up the speed race if you are unwilling to give up the keyword race.

    Semantic web? Wikipedia is my new search engine, because Wikipedia is one of the only sites that wants to return accurate results rather than spamming keywords like mad.

    The semantic web can't happen until you get rid of people who spam keywords. You can't make smarter PCs as long as reviewers and customers obsess about clock speeds.

    The first to change might win, but they will be taking a huge risk, and none of the established players will do that. Remember, it took an upstart like Google to change the search market. Now that Google is big, do you really think it would dare blacklist IGN from its results for having too many empty pages? Of course not. Maybe the next search company will try that, but not Google.

    Change your own site first, ZDNet; then talk about how the rest of the industry should change.

  • by B5_geek ( 638928 ) on Tuesday November 27, 2007 @11:31AM (#21492505)
    I see the biggest limiting factor preventing us from experiencing computing nirvana (a la Star Trek: "computer, do this...") as the artificial limits placed on us by corporations trying to gouge us for more profit.

    Cell phone companies: Imagine how much more pervasive internet access would be if data access didn't cost more than a mortgage payment. I can accept a certain degree of slowness based on technical limitations.

    ISPs: Offer the moon, then restrict your access if you try to leave the driveway. "UNLIMITED INTERNET FOR $20/MONTH*" *If you exceed whatever usage we deem too expensive for us, we will charge you hundreds of dollars and give you a bad credit rating.

    Media companies & DRM: Wake up and drink the kool-aid. Your business model changed, and it all started with the VCR. People do not like being forced to jump through hoops. There are multiple options available that would allow you to thrive in this digital age, but like buggy-whip manufacturers you refuse to adapt.

  • by LWATCDR ( 28044 ) on Tuesday November 27, 2007 @11:37AM (#21492605) Homepage Journal
    The x86, MS-DOS/Windows, and Unix/POSIX.

    Yes, the x86 is fast and cheap, but we have it only because it ran MS-DOS and then Windows. I have to wonder just how good an ARM core made with the latest process would be. How cheap would it be at a tiny fraction of the die size of an x86? How little power would it take? How many of them could you put on a die the size of the latest Intel or AMD CPU? Maybe 16 or 32? It will not run Windows, though... Take a look at the T2 from Sun.

    And then we get to Unix. Yes, I use Linux every day. I love it and I want to keep it. The problem is that I think we could do better. Linux and the other Unix and Unix-like OSs are eating up a huge amount of development resources.
  • Idiot clients... (Score:2, Insightful)

    by Dracos ( 107777 ) on Tuesday November 27, 2007 @11:39AM (#21492643)

    That are too obsessed with what they want, and ignore the developers who know what they need and how to mesh want and need together.

    The site I launched last week (prematurely, at the client's insistence) had no content, but it did have the oh-so-necessary splash page with a 5 meg flash video (with sound!) embedded in it that to the casual observer looks like a trailer for a new Batman movie. All the issues I'd brought up since the project began suddenly became important after the site went live (except the lack of content).

    Do people go to the dentist and demand that their fillings be candy-flavored lead? No. But when that same person wants a website, they demand every poison they can think of (splash page, ambushing the user with audio, flash navigation that search engines can't follow, giant flash ads for themselves on every page, no content) no matter what the "doctor" recommends.

    The best clients don't assume they know the web, and will explain their business model, then ask the developers what should be done.

  • by foobsr ( 693224 ) on Tuesday November 27, 2007 @11:42AM (#21492679) Homepage Journal
    Perhaps the biggest roadblock is the general inability of the masses to grasp technology

    Eventually more like: "Perhaps the biggest roadblock is the general inability of humanity to navigate a complex system beyond an arbitrarily negotiated collection of mostly unrelated local optima".

    For short one may name it "collective stupidity".

    CC.
  • by ErichTheRed ( 39327 ) on Tuesday November 27, 2007 @11:51AM (#21492781)
    I know I'm going to get it for this, but here goes. One of the biggest holdbacks on technology progress is the constant churning of the tech landscape every few months. Before you think I'm crazy, hear me out. How many people work in workplaces that use Technology X where the CIO reads an airline magazine article about Technology Y? The next day, you're ripping out system X, which was actually getting stable and mature, and implementing Y just because it's new. When Y starts causing all sorts of problems, Technology Z will come along and solve everything. Software and hardware vendors love this because it keeps them in business. Most mature IT people can't stand it because they're constantly reinventing the wheel.

    There's a reason why core systems at large businesses are never changed...they work, and have had years to stabilize. Along the way, new features are added on top.

    I know the thrust of the article was "what's holding up progress in general?", but part of running a good IT organization is balancing the new and shiny with the mature and tested. Bringing in new stuff alongside the mature stuff is definitely the way to go. See what works for you, and keep what works and isn't a huge pain to support.

    One other note -- a lot of technology innovation isn't really innovation; it's just repackaging old ideas. SOA and Web 2.0 are the new mainframe/centralized computing environment. Utility computing is just beefed-up timesharing distributed on a massive scale. This is another thing that holds up progress: vendors reinvent the same tech over and over to build "new" products.
  • by johneee ( 626549 ) on Tuesday November 27, 2007 @12:03PM (#21492951)
    Bull. (Mostly)

    Now, I'm Canadian, so I can't comment authoritatively on what it's like in the U.S, but your points make no sense whatsoever. Can it be argued that government gets in the way? Perhaps, but not with the examples you've given.

    Phones in cars: If it was just your life you were putting in danger, then who am I to stand in your way? However, this affects everyone around you. You become statistically more dangerous to everyone around you when you're talking on the phone while driving, and you should not have the right to do that. Governments who do this do it because more people are concerned about not getting run over by dorks who can't wait ten minutes to make their bowling plans than there are dorks.

    Restrictions on talking on the phone in airplanes: There were (valid?) concerns about cell phones interfering with airplane electronics. Now that these issues are better understood, the restrictions are going away. Personally, I'd rather they be safe than sorry.

    Electrical rate-hikes and forced conservation to combat Global Warming: Yup. Again, your right to run ten computers at artificially low rates that don't take into account the total cost of the power it takes (including the environmental cost) doesn't trump my right to not have my house under water in 50 years. You're using power, pay the full cost of it.

    Sarbanes-Oxley and other laws that make business finance riskier (so there are fewer tech startups): It has been proven over and over again that businesses cannot be trusted to monitor themselves, so the public says things like "they shouldn't be allowed to do that, someone should do something about it so my retirement fund doesn't disappear!". Well, guess what? The "someone" tends to be the government, and the "something" is SOX. Got a better way to make sure "they" can't do "that"? I'm all ears, but if you say the invisible hand of the market I'm going to flick your ear.

    And taxes, well, it costs money to do the business of government. I'd like it to be lower myself, but to say that internet shopping should be tax-free just because it's online is just arrogant and dumb. There may be other good reasons for it being tax-free, but if you want your iPod and you buy it online, you should be paying taxes just like the rest of us chumps. We can make a case for lowering taxes overall, but that's a completely different argument.
  • Re:Bullshit (Score:3, Insightful)

    by Chris Burke ( 6130 ) on Tuesday November 27, 2007 @12:07PM (#21493009) Homepage
    Sure, tagging and controlling data is important, but far from difficult, and with well-written programs a good suite of visualization tools is relatively easy. Give me some speed, dammit! Why should I have to wait for my slot on the cluster when I could have the power right here under my desk?

    Not to mention that unless he's talking about more efficient data paths (i.e. more IPC instead of clock frequency, but still more overall execution speed), that kind of 'data tagging' is completely inappropriate for a general purpose CPU. That kind of complexity should be added in software, with hardware merely giving it the necessary 'oomph'. As soon as you start putting high-level data storage constructs into a CPU, it becomes an ASIC -- Application Specific Integrated Circuit. Which should imply "limited usefulness and lifespan" because as soon as you want to change how you tag your data, that hardware becomes useless. Sure, after coming up with a good software-based data storage scheme, if you calculate that the performance of the scheme is worth the large cost, then create an ASIC for it. But to admonish the CPU makers in general for not creating such a thing? That's just backwards.
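    The comment above argues that data tagging belongs in software, with hardware just supplying the "oomph". As a minimal illustrative sketch (the class and tag names here are my own invention, not from any real system), software-level tagging can be as simple as attaching a metadata dictionary to each value; the tagging scheme can then change with a code edit rather than a new chip, which is exactly the flexibility argument being made:

```python
from dataclasses import dataclass, field

@dataclass
class Tagged:
    """A value plus a free-form tag dictionary; the tag schema can change at will."""
    value: object
    tags: dict = field(default_factory=dict)

def find_by_tag(items, key, expected):
    """Return every item whose tag `key` equals `expected`."""
    return [t for t in items if t.tags.get(key) == expected]

data = [
    Tagged(3.14, {"kind": "constant"}),
    Tagged(42.0, {"kind": "measurement", "unit": "kg"}),
    Tagged(7.5, {"kind": "measurement", "unit": "kg"}),
]

measurements = find_by_tag(data, "kind", "measurement")  # the two kg readings
```

    Baking a scheme like this into silicon would freeze the `tags` vocabulary in hardware; keeping it in software leaves it a one-line change.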
  • by Raul654 ( 453029 ) on Tuesday November 27, 2007 @12:15PM (#21493113) Homepage
    The hallmark of good design is that people don't have to know how it works under the hood. How many people who drive cars on a daily basis can describe the basics of what is going on in the engine? (And, I should point out - cars are much more mature technology than computers - simpler and generally better understood)

    That attitude, which is effectively equivalent to the RTFM attitude many people in the open source community take towards operating system interface design, is IMO the single biggest obstacle to widespread Linux adoption. Also (at the risk of starting an OS evangelism flamewar), it is the reason Ubuntu has become so very popular so recently. Ubuntu gets the design principles right, starting with a well-thought-out package manager (admittedly copied from Debian).
  • by Nerdposeur ( 910128 ) on Tuesday November 27, 2007 @12:17PM (#21493147) Journal

    The number one problem is all the idiots who are too stubborn/stupid to learn how to use their tools.

    While this is true in some cases, I think it's mostly snobbery. Well-designed tools can be used intuitively.

    Most people learn exactly as much as they see a need to learn. How much do you know about how your car works? Your plumbing? Your washing machine? Just the basics, I'd guess - enough to use it. Thankfully, your car's manufacturer has kept things simple for you.

    The "idiots" you refer to may have advanced degrees in their field; they just don't happen to be IT people. Don't expect them to waste their time learning everything you know. If you need a lawyer, you'll hire one; if a lawyer needs an IT person, he'll hire one. But in ordinary circumstances neither the law nor technology should intrude in your normal activities.

  • In a rut. (Score:4, Insightful)

    by ZonkerWilliam ( 953437 ) * on Tuesday November 27, 2007 @12:23PM (#21493223) Journal
    IMHO, IT is in a rut, just as the article alludes to. What is needed is to rethink the process: look at providing important information to people where they are. In other words, it shouldn't matter where I am; if I sit down in front of a computer, I should be able to get to my information and applications. Information, not the computer, should become ubiquitous. An RFID card system (with encryption) should allow a person to sit in an office, or a cube, and have their phone calls and desktop forwarded to the workstation they're in front of.
  • by CastrTroy ( 595695 ) on Tuesday November 27, 2007 @12:24PM (#21493229)
    You are right. I've seen many people who are smart in most situations become inexplicably dumb when sitting in front of a computer. People seem to think the computer should just do everything for them, and so their brain shuts off. I'm not sure if that's the exact reason, but it does seem like that is what's happening. Also, I wouldn't expect to be able to walk up to a bunch of woodworking tools and a pile of wood and build a set of bedroom furniture without having to learn anything. I don't know why people have this attitude toward computers, where they should be able to use one without any knowledge.
  • Software Patents (Score:5, Insightful)

    by CustomDesigned ( 250089 ) <stuart@gathman.org> on Tuesday November 27, 2007 @12:40PM (#21493455) Homepage Journal
    ... are the biggest roadblock to IT development. No entity, not even non-commercial open source, is safe from being sued into oblivion for the crime of not only having an idea but implementing it. The risk is still low enough that most of us are still taking it, but it is building like an epidemic. The only defense is a policy of Mutually Assured Destruction backed by a massive portfolio of your own asinine software patents.
  • by 644bd346996 ( 1012333 ) on Tuesday November 27, 2007 @12:51PM (#21493603)
    That's a good point that too many people in the computer industry have yet to grasp, but there are some old, simple technologies that are really past their prime and survive on inertia alone. The example given above of the mouse and cursor is a pretty good one. I'm quite sure that, given a well-designed user interface, I could be far more productive with a multi-touch screen as a pointing device than with a mouse. The problem is that it would completely change the ergonomics of computer workstations and user interfaces (i.e., the screen would have to be closer to horizontal than vertical, and buttons would have to be bigger and rounder on average). Those factors have done a pretty good job of keeping tablets off the desktops of non-artists.
  • by Hoi Polloi ( 522990 ) on Tuesday November 27, 2007 @01:55PM (#21494435) Journal
    I'm always working on old code, so I constantly run into error handlers that say something like "File not found" even though the file name and the location being searched are available. Why not "File X not found at location Y"? (Assuming there is no security issue with giving out this info, of course.) If the info is there, pass it on and help the debugger.
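    A minimal sketch of the point above, in Python; the function name, file name, and paths here are hypothetical, purely for illustration. When the handler already knows the file name and every location it searched, it can include them in the error instead of raising a bare "File not found":

```python
import os

def load_config(name, search_dirs):
    """Return the contents of the first matching file, or raise a descriptive error."""
    for d in search_dirs:
        path = os.path.join(d, name)
        if os.path.exists(path):
            with open(path) as f:
                return f.read()
    # Pass on what we already know: the name and every location tried,
    # so whoever is debugging doesn't have to guess.
    raise FileNotFoundError(
        f"File {name!r} not found; looked in: {', '.join(search_dirs)}"
    )
```

    The extra detail costs nothing at the call site, and (as the comment notes) any security-sensitive paths can be redacted before the message leaves the program.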
  • by nine-times ( 778537 ) <nine.times@gmail.com> on Tuesday November 27, 2007 @02:30PM (#21494907) Homepage

    The hallmark of good design is that people don't have to know how it works under the hood. How many people who drive cars on a daily basis can describe the basics of what is going on in the engine?

    I'd generally agree with you, but an awful lot of people just don't want to learn how to use a computer. At all. It's as if people refused to learn the difference between the gas pedal and the brake, bought manual transmissions but left them in reverse all day, didn't stop at stop signs, and drove on the wrong side of the road. And then, when you suggested those people take driving lessons, they exclaimed, "But why can't someone just make driving easier?!"

    People shouldn't have to know about the internals of how a computer works, but I wish they were willing to learn how to operate them.

  • by Samrobb ( 12731 ) on Tuesday November 27, 2007 @02:37PM (#21495009) Journal

    I'll agree with you that most of the original poster's points don't really make his case. However, I still think his main premise - that government is the biggest roadblock to IT development - stands, but for other reasons:

    • Copyright "innovations"
    • DRM regulations (DMCA)
    • Software patents (and patent trolls)
    • Business model patents (more trolls!)

    You can come up with your own list, I'm sure. There's a cost of doing business that is directly related to government regulation, which is fine and acceptable - if the government says that you need to inspect your product before it ships or follow a prescribed process to produce it, then that's a direct cost. You can figure it into your business plan, allocate resources to meet the requirements, and so on.

    There is also a cost of doing business that is indirectly related to government regulation. This is caused by overly vague, inefficient, and misapplied laws that have made the exploitation of the legal system a business model in and of itself. There is no way to say "At this point, we have complied with all the regulations, and we're in the clear" - everything needs to be taken to court and decided in front of a jury. The best you can do, even if you haven't broken any laws, is hope that you never run into someone with a grudge and more money than you. That is a business killer.

    (To further make my point - while I was writing this, I got a notice that a company has filed a patent infringement lawsuit [blogs.com] against Nicholas Negroponte and the OLPC project... over "illegal reverse engineering of its keyboard driver source codes". Does the case have merit? Who knows? Until the judge rules - or the suing company suggests a modest out-of-court settlement - it's like the Magic Eight Ball says: "Future hazy, try again later".)

  • by GrumblyStuff ( 870046 ) on Tuesday November 27, 2007 @02:42PM (#21495093)
    You don't have to know how your car works but you still have to know how to drive the damn thing.

    The problem is that no one wants to learn how to do anything. Why? Because there's always someone they can bother with the same questions over and over again.

    aka THERE'S A GOOGLE SEARCH BAR RIGHT ON THE FIREFOX BROWSER. Stop going to Google then searching!
  • by sumdumass ( 711423 ) on Tuesday November 27, 2007 @03:06PM (#21495399) Journal
    I don't think you thought that through enough.

    Using your arms and fingers to point at a screen is already a reality. I have had touch screens on monitors for a while, and you don't realize how much energy you end up exerting on something as simple as playing a game of solitaire. If you had to do all your computing like this, you would want the mouse back really fast. If your mouse is set up right, you shouldn't even have to pick your wrist up to move the pointer anywhere on the screen. It is loads more efficient than using your fingers on a touch screen.

    You wouldn't mind it for occasional tasks, but for everyday productive work or play, you will end up disliking the decision. I use touch screens for kiosks sitting in a lobby that show off 3D tours of cabins that a company I administrate for rents out; customers can also check online webmail from them. I don't know how many customers have asked for a mouse after using the systems for a while. We ended up putting a wireless one in a drawer and only bringing it out when asked.
  • Re:Here's One More (Score:2, Insightful)

    by theanorak ( 533531 ) on Tuesday November 27, 2007 @04:25PM (#21496525) Homepage Journal
    I think it's because, no matter how much people wear their "nerd cred" on their sleeve, and how super-duper-smart they might genuinely be, they're all still people - and so lots (most?) want to be famous, a celebrity. No matter how you slice it, I think that still means "being on TV", even if the TV is actually a video podcast or whatever. People still want to be looked at, as well as listened to.

    I can't help but agree that we're not necessarily making the best use of video on the web. There are a whole bunch of things where the easy availability of reasonably high-quality video makes a *massive* difference. How much better is a concert review with a clip of the performance? How much better is a video game clip than a screenshot? There are great uses for video on the web, loads of 'em. But an awful lot of video podcasts and "interview" material isn't necessarily it.
