Top 10 Disappointing Technologies

Slatterz writes "Every once in a while, a product comes along that everyone from the executives to the analysts to even the crusty old reporters thinks will change the IT world. Sadly, they are often misguided. This article lists ten of the top technology disappointments that failed to change the world, from the ludicrously priced Apple Lisa, to voice recognition, to Intel's ill-fated Itanium chip, to virtual reality." But wait! Don't give up too quickly on the Itanium, says the Register.
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Sunday May 17, 2009 @07:47PM (#27989787)

    I've got some barcodes that need scanning!

    • by fuzzyfuzzyfungus ( 1223518 ) on Sunday May 17, 2009 @08:21PM (#27990003) Journal
      No, CueCat doesn't apply. You can only be disappointed by things for which you had some hope.
      • by DECS ( 891519 ) on Sunday May 17, 2009 @08:54PM (#27990211) Homepage Journal

        CueCat had a lot riding on it and lots of fairly high profile partners. Perhaps if it wasn't in the retarded shape of a big plastic cat it might have taken off.

        But what's this about the "ludicrously priced Apple Lisa"? Sure it was $10,000 in 1983, but it wasn't targeted at home users. The only other graphical computing package available at the time, the VisiOn hardware/software kit from the makers of VisiCalc, the killer-app spreadsheet, was less impressive and just as expensive.

        "the base VisiOn software and a mouse cost $790, each application cost between $250 and $400, and it required a $5000 hard drive upgrade on top of a $2000 PC"

        It was not hard to price a $10,000 PC in the mid-80s simply by adding a little RAM and a hard drive. The Lisa pioneered a new class of hardware at a reasonable cost compared to its newness and the competition.

        Apple's Lisa also invented the Office desktop suite, which was bundled into its price. If you wanted an integrated suite of Office software, you'd have to wait out the 80s for another seven years before Microsoft could reassemble its own Office suite for the Macintosh, and then later Windows.

        Office Wars 3 - How Microsoft Got Its Office Monopoly []

        • by Jeremy Erwin ( 2054 ) on Sunday May 17, 2009 @09:33PM (#27990463) Journal

          The Lisa could also be used for Macintosh development.

          During this time I had been designing without programming. I had a Macintosh but no development system for the Mac. In those days, the only way to develop serious Macintosh programs was on a Lisa computer. I had ordered a Lisa from Apple in May, 1984, but I did not receive the machine until August 1. So I spent the first three months of the project doing "paper design."
          Without a development system, all I could do was read the manuals, study my references, and write proposals. As it happens, this can be a good thing...If it does not go on for too long. Too many games are hacked together at the keyboard rather than designed from the ground up. In this case, however, three months of paper design was too long because during the process I needed to test some ideas on the computer before I could proceed with other aspects of the design. It was with great relief that I took delivery of my Lisa and set to work on learning the system.

          Chris Crawford BALANCE OF POWER International Politics as the Ultimate Global Game []

        • by schon ( 31600 ) on Sunday May 17, 2009 @09:50PM (#27990547)

          CueCat had a lot riding on it and lots of fairly high profile partners. Perhaps if it wasn't in the retarded shape of a big plastic cat it might have taken off.

          Perhaps if it wasn't a solution in search of a problem it might have taken off.

          There, fixed that for you.

        • Re: (Score:3, Insightful)

          by westlake ( 615356 )

          But what's this about the "ludicrously priced Apple Lisa"? Sure it was $10,000 in 1983, but it wasn't targeted to home users. Apple's Lisa also invented the Office desktop suite, which was bundled into its price.

          The original Lisa had a 5 MHz 68000 CPU, 1 MB of RAM and two Apple FileWare 871 KB 5.25" floppy disk drives.

          It was not - let us say - the most responsive system Apple ever built.

          A significant impediment to third-party software on the Lisa was the fact that, when first launched, the Lisa Off

      • by Anonymous Coward
        Then it could have included Wolfram Alpha.
    • Well... Stupid as the CueCat was, I finally found a use for it years later. For the price (free), it's a workable barcode scanner with just a little bit of coding. [] [] []

      Now if I could just find a use for all those damn AOL CDs in the attic.

      • by Bowling Moses ( 591924 ) on Sunday May 17, 2009 @09:55PM (#27990581) Journal
        "Now if I could just find a use for all those damn AOL CDs in the attic."

        CD FIGHT!!! Seriously, it's a lot of fun as long as nobody minds a few scratches. Back, oh god, 10 years ago, a friend of mine interned at Microsoft and was on their developer network. If Microsoft made a CD for distribution anywhere in the world, any version, he got it. He had 300+ by year's end. We had about 15 guys in the dorm hucking CDs down the hall and stairwells. Everybody still had the correct number of eyes and nobody needed stitches, just a couple of bandaids. And what else are you going to do with Windows 98 OSR in Swahili?
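For anyone curious, the "little bit of coding" mentioned a few comments up is essentially the well-known CueCat "declawing" transform: each dot-delimited field of a scan is a nonstandard base64 string whose decoded bytes have been XORed with ord('C'). A minimal Python sketch, assuming the commonly published alphabet ordering (verify against your own unit's output):

```python
import base64

# CueCat's base64 alphabet (per the widely circulated decoding recipe),
# mapped position-for-position onto the standard base64 alphabet.
CUECAT = ("abcdefghijklmnopqrstuvwxyz"
          "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
          "0123456789+-")
STANDARD = ("ABCDEFGHIJKLMNOPQRSTUVWXYZ"
            "abcdefghijklmnopqrstuvwxyz"
            "0123456789+/")
TO_STD = str.maketrans(CUECAT, STANDARD)

def decode_field(field: str) -> str:
    """Decode one dot-delimited field of a CueCat scan."""
    b64 = field.translate(TO_STD)
    b64 += "=" * (-len(b64) % 4)              # restore base64 padding
    raw = base64.b64decode(b64)
    return bytes(b ^ ord("C") for b in raw).decode("ascii")

def decode_scan(scan: str) -> list:
    """Split a full scan like '.XXXX.YYYY.ZZZZ.' into decoded fields."""
    return [decode_field(f) for f in scan.strip(".").split(".")]
```

A scan typically decodes to three fields: the unit's serial number, a barcode-type tag, and the barcode data itself, which is why the first two caused the privacy fuss.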
  • VR (Score:5, Interesting)

    by paganizer ( 566360 ) <> on Sunday May 17, 2009 @07:56PM (#27989835) Homepage Journal

    I honestly think if the VR headgear had been less expensive back in the 90's, VRML would have been a LOT more mainstream; I used some of the better goggles, with (IIRC) 480x480 elements, and they rocked. Bulky, uncomfortable, HEAVY, but cool & useful as hell.

    Off Topic: Can anyone tell me what I can do to get back the "you have 3 replies to your last post" info at the top of my /. page? I thought I had just been particularly un-interesting until I checked my email notifications.

    • Re:VR (Score:5, Insightful)

      by InsertWittyNameHere ( 1438813 ) on Sunday May 17, 2009 @08:08PM (#27989923)

      The current 3D MMORPGs are virtual realities.... Millions of people spend the majority of their time in these virtual worlds. Just because they don't wear bulky helmets they're disqualified?

      The article is a bit misguided on some of its top 10 choices.

      • Re:VR (Score:5, Informative)

        by Anonymous Coward on Sunday May 17, 2009 @09:31PM (#27990445)

        I do actually think that current MMORPGs should not be considered VR.

        VR was never about creating a persistent virtual world populated by masses of real people; it's all about the sensory experience. The technology aims to replace all perception of the real world with the virtual, and to make the user's interaction with the computer as close to interacting with the real world as possible. Whether the user is alone in the virtual reality doesn't matter, nor whether there is any persistence between sessions.

        "Cyberspace" is the combination of Virtual Reality with a persistent, populated Virtual World, but just because MMORPGs are approaching that concept from one direction, it does not mean that they are VR.

  • Itanium? (Score:4, Informative)

    by seeker_1us ( 1203072 ) on Sunday May 17, 2009 @08:00PM (#27989863)
    This is the first I have heard of the Tukwila processor. With Intel not releasing a new processor in the Itanium line for such a long time, I thought they had abandoned it.
  • by east coast ( 590680 ) on Sunday May 17, 2009 @08:00PM (#27989865)
    I think this article crosses that line far too much. It really should have focused on technologies of false promise (virtual reality, voice recognition, biometrics) instead of products. Some of the ideas were interesting when they limited themselves to the technology over the product. So what if the Zune fails? It's not the end of a technology.

    And for fuck's sake, can we please stop beating on 10+ year-old technology? I'm sick of hearing retards go on and on about the Apple Lisa and Microsoft Bob, and a bunch of morons who have to make a 640k joke because they don't understand anything more than that. These are the same asshats who've probably never even touched a machine with less than 128 megs of RAM.
    • by An dochasac ( 591582 ) on Sunday May 17, 2009 @09:31PM (#27990439)
      I'd break it into 3 lists:

      1) Technologies which haven't yet and may never live up to their promise:

      • Fusion/Cold fusion: Is this always 40 years in the future?
      • Photovoltaic power: Why hasn't this followed 'Moore's law'-like trends of other silicon-based technology? (yeah, there's a slashjoke somewhere in that sentence)
      • High temperature superconductors: Remain a lab curiosity decades after solid state lasers, bright LEDs, and other lab curiosities made it into our homes.
      • Artificial Intelligence/Expert Systems: For decades expert systems have been able to outperform doctors on diagnostic accuracy. So why hasn't the cost of medical care gone down like every other automatable vocation? Why don't doctors use these tools?
      • Neural Networks: This and fuzzy logic were buzzwords for a while, but what happened?
      • Fuel Cells: There should be a fuel cell in every home furnace, water heater and car.
      • Hybrid cars (be real: the battery capacity is anemic, and the mpg on some of these hybrids is below what some of GM's Cadillacs and other diesel monstrosities of the late 1970s and early '80s had)
      • Pebble bed fission.

      2) Good products which failed to break into the market:

      • Cars with small, efficient Diesel or rotary engines: GM's and Mazda's teething pains gave these technologies a bad rap which hasn't been overcome two decades later (at least not in the U.S. market).
      • Laserdisc: Randomly access each frame, skip the commercials, no copy protection; what's not to like about this 1980 technology?
      • DEC, Cray, Amiga: ...This list should be much longer but it's late. Have we abandoned Josephson junctions, full memory crossbars, fast buses and efficient operating systems?
      • GNU/Linux, OS X and Solaris: Three solid alternatives to Microsoft Windows, each with strengths, and yet none has made a significant dent in Microsoft's market share.

      3) Products which should have never seen the light of day.

      • Microsoft Windows 2000, ME, Vista and that evil paperclip
      • Itanium
      • Any A/V standard blessed by the FCC, RIAA or MPAA (NTSC, HDTV, VHS, DVD, Blu-ray...): They locked us into lo-fi multimedia mediocrity, consumer-distrusting content management and region codes.
      • Nanotech as a buzzword. The pigment crystals in makeup and shampoo should not count as nanotechnology no matter what the marketing people think.
  • Palm (Score:5, Interesting)

    by Bios_Hakr ( 68586 ) <> on Sunday May 17, 2009 @08:01PM (#27989871)

    At one point, I could write Palm better than block letters. I remember one class where I forgot my Palm. I took notes on a piece of paper. When I got home, I noticed that I had written in Palm!

    Anyway, Palm is now a could-have-been. Lost out to Smartphones I guess...

    • Re: (Score:3, Interesting)

      It wasn't that Palm didn't do what it was meant to do for its time. The problem is that Palm didn't add enough enhancements over time to beat the competitors that caught up. You can read all sorts of reasons why; namely, the Palm OS wasn't very upgradeable, and Palm spent too long deciding what to do about the future.
  • Bluetooth? (Score:5, Interesting)

    by VinylRecords ( 1292374 ) on Sunday May 17, 2009 @08:02PM (#27989877)

    It's only now that Bluetooth is getting to be useful, and only then in very limited terms. Sure, it allows people to walk around babbling into headsets, but it could have been so much more.

    Umm....the Sony PS3 and Nintendo Wii make major use of Bluetooth technology. In fact those are the only devices I own that I use Bluetooth for.

    I wouldn't say the Bluetooth being in the Dualshock 3 and Wiimote is a disappointment at all for both the creators and consumers of the technology.

    Even if Bluetooth is underperforming based on its technological potential is it really one of the 10 most disappointing technologies currently?

    • Re: (Score:3, Interesting)

      by salesgeek ( 263995 )

      Bluetooth is on the list because it's been around for years and you still can't get decent support for stereo headsets or other simple connections to work. It's been underwhelming.

      • Re:Bluetooth? (Score:4, Informative)

        by Tony Hoyle ( 11698 ) <> on Sunday May 17, 2009 @09:12PM (#27990331) Homepage

        Bluetooth is on the list because it's been around for years and you still can't get decent support for stereo headsets or other simple connections to work.

        Get a proper phone.

        This stuff has worked for *years*. Bluetoothing files between phones and PCs is a staple for a lot of people around here (I used to participate, but it gets a bit dull when you've had the 50th 'welcome to the gay hotline' ringtone sent to your phone).

  • Apple Lisa?? (Score:3, Insightful)

    by Brett Buck ( 811747 ) on Sunday May 17, 2009 @08:04PM (#27989891)

    Not following them on that one, and they have the chronology completely wrong. Jobs, in particular, knew the Lisa was DOA and knew that the Mac was the way of the future for the company, and pulled people off it all the time to work on the Mac. They are right, in that the Lisa was a very nice machine (I wanted to get my father one to replace his typewriter a few years ago - he needed and wanted no more - instead he wound up with a $299 Officemax Dell shitbox that still barely functions from day to day) but I think it certainly doesn't deserve a Top 10 list. It wasn't a big enough deal to matter. I would have put the Newton on there before the Lisa.


    • by emjoi_gently ( 812227 ) on Sunday May 17, 2009 @08:29PM (#27990055)
      Well yeah, the Lisa might have been a failed PRODUCT, but it wasn't a failed technology. Whether the Mac is a parallel product or an evolved product, the point is that the idea of user friendly computer with a WYSIWYG, mouse based GUI was not a failure. This was an early unsuccessful attempt, but in the long run the problems and costs were sorted out. You are working on a machine right now, no matter what the brand of OS, that took those basic ideas and made something successful out of them. And the Newton... same thing. It's Version One of a new tech. The Newton failed, but the Palm arose out of it, and from there a whole world of handhelds and now smartphones.
  • Firewire (Score:5, Informative)

    by a whoabot ( 706122 ) on Sunday May 17, 2009 @08:06PM (#27989907)

    "Outside of a few models of high-end video cameras, FireWire isn't seen much these days."

    How about audio applications? If you want an audio interface for your laptop, you're almost always better off buying a Firewire model than a USB one; but also for many desktop applications Firewire can fit the bill over PCI/PCI-E. Plenty of the audio gear companies (M-Audio, RME, MOTU, Tascam) of course are still putting out new models using Firewire now and will continue to do so for the foreseeable future.

    • Re: Firewire (Score:4, Interesting)

      by RudeIota ( 1131331 ) on Sunday May 17, 2009 @08:56PM (#27990219) Homepage

      How about audio applications? If you want an audio interface for your laptop, you're almost always better off buying a Firewire model than a USB one; but also for many desktop applications Firewire can fit the bill over PCI/PCI-E. Plenty of the audio gear companies (M-Audio, RME, MOTU, Tascam) of course are still putting out new models using Firewire now and will continue to do so for the foreseeable future.

      I like Firewire, and as of a few years ago it's (finally) ubiquitous, included with decent PCs/system boards and pretty much every Mac.

      However, I'm concerned about the future of it. When Apple did not include FW ports on their Macbooks several months ago, I wondered what this meant for Firewire. They also didn't include them on the Air.

      Firewire is Apple's brainchild and they've been pushing it for a decade, but what was the motivation for this? I like to think maybe it was to entice people to purchase the Macbook Pro (which still has FW800 ports) -- No, actually I don't like to think that -- but at least it isn't the other potential reason: The end of Firewire.

    • Re:Firewire (Score:5, Informative)

      by SplashMyBandit ( 1543257 ) on Sunday May 17, 2009 @08:59PM (#27990247)
      IEEE-1394b (a revision of 'FireWire') is used in the F-22 and F-35 fighters. This is because it is far superior to USB in real-time applications (isochronous modes). FireWire also uses far less CPU than USB, and has better transfer rates in practice (despite the 'theoretical' peak USB speed being faster [480 vs 400 Mbps] than 1394a). The real reason USB was invented was so that IBM and Microsoft wouldn't have to pay Apple for FireWire royalties. USB is the result of a business decision, not because it was superior technology to FireWire.
    • Re:Firewire (Score:4, Insightful)

      by v1 ( 525388 ) on Sunday May 17, 2009 @11:28PM (#27991209) Homepage Journal

      I was in full agreement with all the items they brought up until I got to Firewire. You could tell the author has had little or no exposure to it. Its only major downfall, if you want to call it that, is that very few Windows PCs come with it by default. For the people that can use it, it's very handy for streamed raw video, high-speed data transfer, and occasionally in unexpected places like networking and scanners.

      Calling USB the "firewire killer" is almost laughable. I ran some tests recently on drive IO speeds on a variety of interfaces here, including IDE, SATA, firewire 400, firewire 800, and watched firewire 400 drill USB480 into the ground on a consistent basis. Insert a hub (since USB is not chainable) and the speed gets butchered even worse. Considering that (for whatever silly reason) windows pcs don't come with it and have such a large market share, and manufacturers are still making products that use firewire as an option or the only interface, there's obviously an advantage to it over USB.

      Since there is currently no video-over-usb standard, all sorts of bad things result from a usb only camcorder. USB is not designed to be peer-to-peer, it's peer-to-host, and that severely limits its application and what works naturally with it. I don't even see why the author made a blanket comparison between the two, since mass storage is the only use they really share. Though nowadays high end scanners can use USB480 which is a good thing.
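Informal drive IO comparisons like the one described above are easy to reproduce. A rough sequential-read timer in Python (the device paths in the comment at the end are hypothetical placeholders; for honest numbers, read a file larger than RAM or drop the OS cache first):

```python
import time

def read_throughput(path, total=256 * 1024 * 1024, chunk=1024 * 1024):
    """Sequentially read up to `total` bytes from `path` in `chunk`-sized
    blocks, returning the observed throughput in MB/s."""
    read = 0
    start = time.monotonic()
    # buffering=0 avoids Python-level buffering skewing the measurement.
    with open(path, "rb", buffering=0) as f:
        while read < total:
            block = f.read(chunk)
            if not block:          # hit end of file
                break
            read += len(block)
    elapsed = max(time.monotonic() - start, 1e-9)
    return (read / (1024 * 1024)) / elapsed

# Hypothetical device nodes -- substitute one disk per interface under test:
# for dev in ("/dev/sda", "/dev/sdb"):
#     print(dev, round(read_throughput(dev), 1), "MB/s")
```

Running it against the same model of drive on FireWire 400, FireWire 800, and USB 2.0 enclosures (with and without a hub in the path) is enough to reproduce the kind of gap described in the parent comment.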

  • by mfnickster ( 182520 ) on Sunday May 17, 2009 @08:07PM (#27989913)

    "Not surprisingly, the Lisa did not sell too well and the company was sent back to the drawing board to develop the Macintosh."

    Neat way to sum it up, but not accurate. Macintosh was nearly finished while Apple was still pushing the Lisa, and Jef Raskin's original concept for the Mac pre-dated the Lisa.

    Of course, once Jobs got his mitts on it, he completely changed it from Raskin's vision, eventually provoking Raskin to quit Apple.

  • Real Top 10 (Score:5, Insightful)

    by salesgeek ( 263995 ) on Sunday May 17, 2009 @08:07PM (#27989919) Homepage

    There are much greater fails. Fails of such epic magnitude their ripples are easily confused with the tides on the ocean of technology:

    10. Floptical storage. Great stuff if you want to lose data.
    9. DIVX DVDs. The ones that you could only buy at Circuit City.
    8. VRML. Virtual reality is still around. But VRML was an abortion.
    7. CueCat. The epic fail that made Slashdot famous.
    6. iOpener. What happens when you try to sell a blade-free razor using the razor-blade model.
    5. The Apple Pippin. You've never seen it, it's that bad.
    4. Windows ME. Awful, bad, hideous don't describe this one.
    3. Chandler. Mitch Kapor's been a part of lots of great things, but Chandler is the PIM we'd all like to forget.
    2. MS Bob. Any top 10 tech failure list without it is not credible.
    1. Windows Vista. One would think ME would have taught Redmond a lesson.

    • Bubble Memory (Score:5, Interesting)

      by localroger ( 258128 ) on Sunday May 17, 2009 @08:34PM (#27990085) Homepage
      Back when a 16K x 1 bit RAM chip cost $40, and needed a herd of glue chips to keep it refreshed, bubble RAM was supposed to save us. It was fast, nonvolatile, and (for those early 80's days) dense. There were demo systems and ads and all kinds of hype. And then it just never sort of happened. Dynamic RAM kept getting cheaper and easier to use and the bubbles never came out at all.
  • Top so far (Score:5, Insightful)

    by gmuslera ( 3436 ) on Sunday May 17, 2009 @08:17PM (#27989975) Homepage Journal
    Artificial intelligence. We have expert systems, neural networks, etc... but a "human-like" artificial intelligence? The singularity with the best odds of happening near us in the future is a black hole.

    The close second, if we include transportation, is (antigrav) flying cars, of course.
  • by atheistmonk ( 1268392 ) on Sunday May 17, 2009 @08:21PM (#27990001) Homepage
    Based on what appears to be their idea of how long widespread adoption of new technology should take before it is considered a failure, I'm surprised they haven't ripped on IPv6.
  • The best line (Score:4, Insightful)

    by erroneus ( 253617 ) on Sunday May 17, 2009 @08:23PM (#27990025) Homepage

    What is wrong is expecting businesses to pay for something they don't need.

    That line can be used in many places at many times for many sides of an argument. It's my favorite argument for staying with Windows XP and Office 2003.

  • by reporter ( 666905 ) on Sunday May 17, 2009 @08:43PM (#27990129) Homepage
    Has anyone noticed that the entire desktop market is now owned by the x86 architecture? It killed SPARC, PowerPC, Precision Architecture (PA), MIPS, and Alpha. PowerPC and SPARC held out until the very end about 2 years ago. Even they were shoved out of the market.

    I literally cannot buy a non-x86 desktop or laptop even if I paid $5000.

    In the early 1970s, who could have guessed that the great-great-great-grandson of the 4004 would dominate 100% of the desktop market and a sizeable chunk of the rest of the computing market?

    • Re: (Score:3, Informative)

      on MIPS, beware China!

      Combining Linux with Wine, ReactOS and qemu is the basis of a Wintel killer.

      The platform? LUK [] on Loongson [].

      Perhaps no match for Nehalem-based desktops, but a challenger for the netbook market. A platform that runs Windows applications via seamless x86-->MIPS translation. Intel and MS may struggle to match the price point, which is good for consumers because Intel will be forced to considerably beef up the performance of Atom to compete on value. (Not to mention multi-core AR
  • Weird choice (Score:5, Insightful)

    by Tweenk ( 1274968 ) on Sunday May 17, 2009 @08:43PM (#27990133)

    They did not mention DRM? What the hell?

    Also this quote about Ubuntu:

    Maybe it was just the overenthusiastic marketing or the fanboys who swarmed to the system, but Ubuntu really was supposed to change everything, whereas the operating system landscape looks very much the same these days.

    It did lower the price of XP for netbooks down to a few dollars though... In a way, desktop Linux made netbooks possible - otherwise Microsoft wouldn't lower the price of their system enough for this class of machines to become viable.

  • by Xonstantine ( 947614 ) on Sunday May 17, 2009 @08:46PM (#27990159)

    Talk about the most ridiculously overhyped invention in recent memory...for a damn scooter.

  • by idiotnot ( 302133 ) <> on Sunday May 17, 2009 @08:47PM (#27990163) Homepage Journal

    Some of the products, like FireWire, are in widespread use, although maybe not for consumers. I used to work in broadcast; we had a ton of FireWire equipment where I worked.

    Itanium, similarly, has a place in certain markets. If you have an HPUX or VMS shop (like lots of government agencies), you're buying Itaniums. I know that Navy and Coast Guard have quite a few Itanium systems in production.

    As for Vista, after three years of use, I am very impressed. The only major issue I've had was with the audio/network performance present in the RTM build. Only bluescreen I've had during that time was due to a stick of RAM that'd gone bad. I can't say the same about 95, 98, NT4, 2K, or XP. And it's poor short-term memory on most people's part; XP was a steaming pile when it was released. The shop where I was working didn't start adopting XP over 2k until SP2 came out. People just have forgotten how bad it was, because after several years, it became a stable product. Vista was far better at release.

    Similarly, I've been very impressed with 2008 Server. Am in the process of implementing it throughout an enterprise, and haven't encountered any major difficulties. UAC is annoying, though.

  • Push (Score:4, Insightful)

    by ClosedSource ( 238333 ) on Sunday May 17, 2009 @08:48PM (#27990173)

    PointCast anyone?

  • by CAIMLAS ( 41445 ) on Sunday May 17, 2009 @08:49PM (#27990185) Homepage

    My friend, Duke, just read the article, and man is he pissed.

  • by nausea_malvarma ( 1544887 ) on Sunday May 17, 2009 @08:51PM (#27990193)
    I'm sick of top ten lists. Why do I care that some group at a magazine chose an arbitrary number of things in some category at their discretion, with no real measurable criteria for entering the list? Correct me if I'm wrong, but the whole point of a top ten list is to attract visitors to argue about what the magazine chose and suggest things of their own that didn't make the list. It's a pseudo-event in pure form: a news story with no real news in it.
  • What about the 432? (Score:4, Interesting)

    by geekgirlandrea ( 1148779 ) <> on Sunday May 17, 2009 @08:56PM (#27990217) Homepage

    No list of tech disappointments could be complete without the Intel 432 []. Object-oriented machine code and hardware-assisted garbage collection - what's not to love?

  • by jmv ( 93421 ) on Sunday May 17, 2009 @09:10PM (#27990319) Homepage

    Now, that's a more accurate title.

  • by CAIMLAS ( 41445 ) on Sunday May 17, 2009 @09:42PM (#27990503) Homepage

    Since the article is almost completely pointless (it could've been written at any point in the last decade, almost), here's my list.

    1) The Linux kernel. Yes, I use linux almost exclusively these days, but what the fuck happened to the quality since 2.6 came out? ext3 performance issues, CFQ and general i/o issues (I could do things on my 550MHz athlon w/ 256M - with respect to concurrency of tasks - that made my 1.2GHz, 512M system grind to a halt); VM priority; potential libata problems with PATA disks; breaking and shipping a new version with broken drivers (acpi) or architectures (PCMCIA/bluetooth) when it worked previously, just because the architecture was being re-written to make it 'work better'. "Leave it to the distro packagers to fix".

    2) Ubuntu. It has a lot of promise, but once you scratch the candy coating, you can see the rust underneath due to hasty product development. Part of this is due to #1, but the rest is due to simple negligence. There is absolutely no reason for basic SMB/CIFS filesharing to be fundamentally broken in a distro indefinitely; and there is no sane reason why a bug that's been fixed upstream should not be in a new distro release months after the bug has been fixed.

    3) Xorg. I remember when it forked from XFree86 and thought "good, maybe they can improve it". It's being improved, but damn is it taking a while. I imagine an alternative could've been written in the time they've taken to get this far, with the ability to run Xnest (and still have all the features of today). Why is X taking almost a gig of memory?

    4) "netbooks". I know they've only been out for a couple years now in any concrete form, and that they're "wildly" popular, but they're selling something which doesn't take advantage of what was learned 7-9 years ago when "HPC" computers were around. There were certain features which were almost a sure-thing sell: long battery life, decent display readability, touchscreen, and a usable keyboard. Current netbooks are awkward and lacking in all of these points.

    5) ARM processors/SBC/SoC as offered to the 'consumer'. This directly, somewhat, relates to #4. In the last 3-5 years, their prices have gone up - but with no substantial improvement in their specs. Yes, you can get a SoC with a 400MHz ARM CPU and 512M and host USB and SATA, but it'll cost you over $400 to do so. And really, for the cost of a 200MHz non-Intel SoC, running at ~130-250MHz with 32-64Mb, it'll still cost more than an entire Atom system (WindPC).

    6) Intel Atom. 40W power use with the Intel chipset, and (until just now, basically) you were limited to the Intel chipset. That's horribly self-defeating, making them only desirable on price.

    7) "Smartphones". If they're so damn smart, why can't I use them to their full potential? Most of them have some awesome hardware, yet we're restricted to the horrid software stacks on them (Apple included). Why no host mini-USB? I can't wait for MS to release a WinMo phone, because at least then things would (hopefully) get stirred up a bit.

    8) Anti-spam filtering. It's still a huge up-hill battle to try and deal with it, and there isn't a solution in sight.

    9) SSD storage, and rotation-free storage in general. It is not living up to expectations or promises, never mind the crystal storage methods mentioned almost a decade ago that got some really nice density.

    10) Duke Nukem Forever. Let's face it: everyone wanted to at least see if it'd be as fun as Duke3D.

  • by nausea_malvarma ( 1544887 ) on Sunday May 17, 2009 @09:53PM (#27990563) would have included the Internet, since nothing good ever came out of it. Period.
  • Bluetooth has always worked great for me. For the last 7 or 8 years I've used it to sync contact/calendar data between my Mac and whatever mobile phone I've had (I'm still an iPhone holdout). Plus I use it for file transfers between the computer and phone, and to tether to the phone to use its WWAN connection.

    And I'm a huge fan of Firewire and hate that it lost out to USB. Firewire is a lot more versatile and was designed that way from the start (comes in damned handy as a network port between two Macs sometimes, because you can run TCP/IP over it). USB was never supposed to be much more than a new connection for keyboards and mice, and now they're shoehorning other capabilities into it that it was never designed for-- which IMHO never leads to good things. This line from the article particularly annoyed me: "I know of at least three people who purchased shiny new portable video recorders and were stuffed when they realised they'd have to upgrade their systems to support FireWire." Oh, noes! They have to spend a few bucks on a PCI card! The horror!!!! Seriously? Is this a real gripe? I mean, the cheapest Firewire card at NewEgg costs $6. A really good one will only set you back $40 or so.


  • by lotho brandybuck ( 720697 ) on Sunday May 17, 2009 @10:25PM (#27990751) Homepage Journal
    For some reason, I wasted my time wallowing in the pages of schadenfreude. What I want to know about is the authors of these sorts of articles... Have they ever worked on a useful project? Sure, Lisa or the Zune didn't save the world, but what did the authors do for humanity?
  • by Animats ( 122034 ) on Monday May 18, 2009 @01:44AM (#27991845) Homepage

    Not products, technologies.

    1. RISC. RISC allowed building simple CPUs that executed one instruction per clock. But once superscalar technology was developed, with more than one instruction per clock, RISC had to keep up. RISC CPUs became as complex as CISC CPUs, and the code density was worse. In the end, RISC was a lose, except at the very low end, like Atmel microcontrollers.
    2. E-beam IC lithography. Exposing an IC with an electron beam, rather than "light" (which is now coming up on the soft X-ray end of the spectrum) has been a promising technology since the 1970s. No mask is required; just active steering of the electron beam by a computer. It works just fine. Line widths are better than what can be achieved with light and masks. It's just too slow.
    3. Solid state magnetic memory. There have been many schemes for magnetic storage without moving parts. Core memory, of course. Magnetic bubbles. Magnetoresistive RAM. All work technically, but have never had much market share.
    4. Cryogenic computing. This goes back to the early 1960s. NSA and IBM put a huge amount of effort into trying to make this work. They had gigahertz logic in the 1960s. The problem was that the gates could be made very fast, but not very small. IBM tried again with Josephson junctions. There's even a plan floating around DoD for a cryogenic supercomputer. All this stuff works, but mainstream technology always ended up passing the technologies that ran in liquid helium.
    5. Smoke printing. This is a forgotten idea. Write a charge pattern on the paper, run it through a smoke cloud of toner-like material, then fuse the toner. It's like laser printing, but without the photoconductive drum. The problem is that the process is very sensitive to humidity, and a printing technology that requires such tight environmental controls isn't worth the trouble when there are such good alternatives.
    6. Shape-memory alloys. These were once touted as a new kind of motor, and a way to make robotic muscles. Run current through them, and they bend. The problem is that it takes a lot of current (because it's the heating that does it) and the actuators are slow.
    7. Circuit-switched packet switching. It's quite possible to have useful circuit-switched data networks. Tymnet and Telenet, in the 1970s and 1980s, worked that way, as did X.25. At one point, this looked like the future, because congestion and quality of service can be better managed in a circuit-switched system. Telcos like this kind of thing, because it leads to connection-oriented billing. But pure datagrams won out, mainly because bulk bandwidth became cheap enough that the middle of the network could run at low load factors.
    8. Wireless power transmission. Not just Tesla; remember "powersats" and "rectennas"? A Japanese project once tried microwave power transmission between two islands. It worked, but wasn't efficient enough to be useful. We may see a comeback of this in the form of short-range wireless charging systems.
    9. Very Long Instruction Word machines. Each word contains multiple instructions, executed simultaneously. The Itanium is an example of this class of architecture. The problem is that the compiler has to be very, very smart to code all the concurrency into the instructions. There doesn't seem to be a performance gain over more classical architectures. This is the curse of unusual architectures; MIMD machines, dataflow machines, hypercubes, perfect shuffle machines, and similar exotic ideas have come and gone. These machines can and have been built, but are very hard to program.
    10. Wrist-mounted devices. From Dick Tracy's two-way wrist radio to the HP-01, no wrist-mounted gadget with much more functionality than a watch has ever caught on. Around 1998, there was a flood of wrist pagers; that died out quickly. Even though one could cram considerable functionality into a watch-sized device today, there's little interest in doing so.
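
    On point 8 (wireless power), the efficiency problem falls straight out of the Friis transmission equation: the received fraction of power scales as (wavelength / 4*pi*distance)^2 times the antenna gains. A quick back-of-envelope check, with made-up but plausible numbers (2.45 GHz, 1 km, a pair of 30 dB dishes), not figures from the actual island experiment:

    ```python
    import math

    def friis_power_ratio(freq_hz, distance_m, gain_tx, gain_rx):
        """P_received / P_transmitted for ideal free-space propagation (Friis equation)."""
        wavelength = 3.0e8 / freq_hz  # speed of light / frequency
        return gain_tx * gain_rx * (wavelength / (4 * math.pi * distance_m)) ** 2

    # 2.45 GHz beam over 1 km between two 30 dB (1000x) dish antennas:
    ratio = friis_power_ratio(2.45e9, 1000.0, 1000.0, 1000.0)
    print(f"fraction of power received: {ratio:.2e}")
    ```

    Even with two big dishes, well under a tenth of a percent of the power arrives, and it gets worse with the square of the distance. That's why serious beaming proposals need enormous apertures and rectenna farms, and why the short-range charging pads mentioned above are the only form that has a chance.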

"I will make no bargains with terrorist hardware." -- Peter da Silva