Technology

20 Factors That Will Change PCs In 2002

bstadil writes: "CNN's tech site has posted a list of the 20 most significant factors that will change the PC in 2002. It's not very technical, but it would be interesting to get the Slashdot community's take on it, plus what they think needs to be added."
This discussion has been archived. No new comments can be posted.


  • My wish list (Score:5, Insightful)

    by Gothmolly ( 148874 ) on Wednesday December 26, 2001 @09:35AM (#2751507)
    Hard disks that are faster, not bigger. If I need more space, I'll add more spindles. How about giving me a disk that can push 50 or 100 MB/sec from the platters?

    Bring back those monitors-with-built-in-USB-hubs.

    Cheap SMP. I'll take my dual 550 over a single 1 GHz any day of the week. How about 8x500 MHz on the desktop, instead of 1x4 GHz that's still crippled by one CPU-hogging app?

    Less patronizing Windows UI ("My Documents", "My Computer")

    A decent NFS client for Win32.

    That's all I can think of for now. I'm not terribly interested in vapor markup languages or 1 GHz palmtops. Give me something I can use.

    dd if=/dev/coffee of=/dev/geek
  • by ZaneMcAuley ( 266747 ) on Wednesday December 26, 2001 @09:38AM (#2751513) Homepage Journal
    Subscription based Software / Services (games, streaming content etc etc)
  • What they missed (Score:5, Insightful)

    by shawkin ( 165588 ) on Wednesday December 26, 2001 @09:40AM (#2751518)
    Advanced operating systems. Defining technology as a subset of an unresponsive monopoly OS is a waste of time.

    Efficient programming tools. If four programmers could write a better Photoshop in two months and distribute it electronically, then things would change.

    Human factors driven technology. People will buy more stuff that works easily and makes them happy.
  • 802.11x (Score:5, Insightful)

    by Jaggar ( 533765 ) <eckenrode.7NO@SPAMosu.edu> on Wednesday December 26, 2001 @09:45AM (#2751530)

    I think that the largest change coming in the next few years, at least for laptop users, will be the increasing prevalence of pervasive, high bandwidth wireless networks based on the IEEE 802.11a-g protocols. I have the pleasure of working for one of the few companies that makes extensive use of these devices (we design them, actually), and I can't imagine working without them. When I go to a meeting, I just plug a card into my laptop and go. In the meeting I can bring up all of the relevant documents and data, check my email and stocks, and, most importantly, read Slashdot.

    These technologies will have an even larger impact in academic institutions. At this moment, I know of at least two universities (Carnegie Mellon and, interestingly, Akron University in Ohio) that have essentially omnipresent 802.11b wireless networks. Students with laptops can access the campus network as well as the internet from any point in the university, even the football field.

    I think that this will be the area of largest noticeable change because it is not incremental. We expect faster processors, greater storage capacity, faster busses, etc., but the ability to connect to the internet with a broadband connection from almost anywhere, that will be new and therefore more noticeable. However, even though it is novel, it is implemented with mature technologies that have been tried and tested for several years now, at least in the case of 802.11b.

  • by hiryuu ( 125210 ) on Wednesday December 26, 2001 @09:48AM (#2751536)

    Can't half tell that the non-hardware concepts got some severe business bias, can we? Geez... I don't want "Presence," that's for damned sure. If I want to be found, I make myself easy to find - so why on earth do I need to be tracked across wireless devices, PCs, cell phones, etc.? And the concept of having to "pay" to avoid it? Their comparison to caller ID and the blocking of such is bogus - if I'm calling someone, that's one thing, since I initiated the contact - but tracking location and usage? Ick.

    And that's before the potential terrors of an electronic wallet - not that it's a bad concept, but I don't think it should get a '9' particularly when you consider that some monolith or other will be providing the service, and in a nasty, centralized fashion.

    Bah.

  • by ruvreve ( 216004 ) on Wednesday December 26, 2001 @09:55AM (#2751554) Journal
    I saw no mention of security improving. I realize that the hardest part of maintaining a secure environment is making the 'user' comply, but there HAS to be a better way of protecting people from themselves. Sort of like how a burglar who trips and breaks his leg in your house can sue you.

    I mention security because of the "Presence technology" that was discussed. If somebody can get ahold of my network identity and then use that identity to pinpoint my location, we could have a whole new wave of identity theft. Not that I have thought it over much, but knowing exactly where somebody is opens up a whole new set of opportunities for exploits.

    White collar criminal -**- Signing Off.
  • by Kjella ( 173770 ) on Wednesday December 26, 2001 @09:57AM (#2751557) Homepage
    He's right where M$ wants him to be... never knowing he needed it until it was put right in front of him. He and the great crowds like him are what will give M$ the IM monopoly too, because "everybody else" will be on Messenger. Yet another blatant case of M$ extending their monopoly, but I don't suppose that raises any eyebrows here because it happens so often - and nowhere else either, because they don't care, in particular the Justice Dept.

    Kjella
  • by ConsigliereDea ( 541188 ) on Wednesday December 26, 2001 @09:57AM (#2751559)
    My hope is that the people who were polled to come up with this list were rating Microsoft Passport's "Impact meter: 8" as a warning, not a subtle endorsement. The Presence Technology rating of 7 scares me. I don't want people to be able to track my every move, and I shouldn't have to pay for the right to be left alone. Isn't this a little too close to the conspiracy theory of the government implanting chips at birth? I have never been one to take that sort of thing seriously, but I want to know I can keep on eating and breathing technology without some hacker knowing my life.
  • Applications! (Score:3, Insightful)

    by defunc ( 238921 ) on Wednesday December 26, 2001 @10:04AM (#2751574)
    The article seems to be focused on hardware. Rather, it should have focused on the future applications that will take advantage of all this new and powerful hardware and these interfaces.

    People want stuff they can use every day. A PC with software that uses voice recognition and learns my usage patterns is what I really want. I don't want to have to mess around anymore with DLLs, the registry, LD_LIBRARY_PATHs or .conf files. Applications should learn how to adapt to my usage and fix themselves when broken. How about instantaneous boot-up, people? My G4 with OS X wakes up in 5 seconds and boots in under 2 minutes.

    The idea of hyper-threading will hopefully create a new breed of applications, on both the client and server side. The prospect of having a realtime application on my desktop is very appealing. No more waiting for the application to respond to my command!!!

  • by Bowie J. Poag ( 16898 ) on Wednesday December 26, 2001 @10:16AM (#2751603) Homepage
    400 gigs and a cloud of dust: AFC hard drives


    Not a bad idea. As the average amount of free space per PC increases, software makers will find a way to utilize it. They always have.


    PDAs move to another level: The 1-GHz palmtop


    Doubtful. Unlike cell phones, the demographic that buys palmtops isn't made up of teenagers. The people who buy and use palmtops aren't obsessed with making them smaller. They want connectivity first, then speed, then glitz. Besides, the typical uses of a palmtop don't extend to high-end computing. Having 1 GHz under the hood isn't going to allow you to write your term paper any faster.

    Scintillating screens: Organic-light-emitting diodes


    Vastly overhyped. The intensity of OLEDs fades with time. Compared to TFT, they look like shit, perform like shit, and go bad far quicker. They're also more expensive to produce. It'll be a novelty, but it won't go anywhere in the end, IMHO.

    The message is the medium: Next-generation instant messaging


    Uhhh... ever heard of IRC? CUSeeMe? This is hardly a new technology. It's the same paradox as the video phone. Everyone thinks videophones would be totally cool, but no one's willing to have their hair and make-up done in order to answer the phone. Pound for pound, text remains the best medium for large groups of people to share information. What good is a teleconference if only one person at a time can talk? If more than one person starts talking, you might as well be listening to a washing machine.

    Tireless wireless: 802.11 networks


    I absolutely agree. 802.11 is the beginning of something very big. Community networks, and the death knell for wire-provided technologies like DSL, Cable, 56K modems, etc.

    In search of a common language: Markup languages for everything

    Here we go again, failing to learn from history. People, it's like this: programmers don't think alike. That's what makes them programmers. You'll no sooner see people using the same language for markup than you'll see everyone coding in Smalltalk. People gravitate toward languages based on their ability to be proficient in them. No matter how good XML is, people will still use HTML because it suits them better, or PHP, or Perl, or C, or assembly, or freakin' Smalltalk if they want. Name a single time in history when a programmer was considered proficient in his art WITHOUT knowing more than one language. Get my drift?

    Getting a little hyper: Hyper-threading


    Big clue for ya, gang: 99.9% of your PC's lifespan is spent waiting for your lazy human ass to tell it what to do. Hyper-threading assumes that Moore's Law will flatline. It won't. What good is greater availability of processing power when you're STILL not addressing the fact that for most of your machine's usable lifespan, it's sitting idle anyway? It's like code optimization research. As time goes on, it becomes more and more irrelevant.

    And now, my short list of what WILL take off:

    802.11 and its offspring

    Corporation-controlled P2P trading

    P2P for programmers: wide and seamless code-sharing environments that replace segmented environments like SourceForge, Savannah, etc. Why not search for a bunch of good 3D engines to pick from instead of just MP3s?

    GUI optimization. Out with the old, in with the new. The need for a more intuitive interface always wins in the long run, over tradition-based designs. (cough)Scrollball(cough) :)

    User-centric computing instead of application-centric computing.

    Self-regulating and self-maintaining applications... Just picture it. Your antivirus software is eventually rendered obsolete because each of your applications, independent of one another, monitors its own structure and is aware of viruses that may attempt to exploit it. It also downloads and applies new updates, code patches, etc. Maintenance-free from a user standpoint.

    Government requirements for both OS security and application security. Possibly even a ratings system.

    Where will it end! :)

    Cheers,
  • by tshoppa ( 513863 ) on Wednesday December 26, 2001 @10:23AM (#2751621)
    If markup languages such as XML replace proprietary binary formats like MS Word's and so on, it will be very nice!

    Oh, the hard drive manufacturers will love this. A simple one-page document will take gigabytes of hard disk space :-).

    Wasn't there a Slashdot story in the past year about a common binary protocol being replaced with XML, with a corresponding increase in storage/network requirements by a factor in the hundreds?

  • by ZigMonty ( 524212 ) <slashdot@NosPam.zigmonty.postinbox.com> on Wednesday December 26, 2001 @10:28AM (#2751638)

    Data magnet: Magnetic RAM

    What is it? Fast memory that retains data even after you've turned the power off.

    What's cool? MRAM uses magnetic charges instead of electricity to store bits; when you turn off your machine, your data remains in memory.

    This sounds a hell of a lot like magnetic core memory. It's funny that they portray magnetic RAM as something new. Yes, I know the new implementation of this will be very different (sub micron scale etc) but the idea was popular decades ago. Does anyone have a good comparison of the old way and the planned new way?

  • Cheap ADSL (Score:2, Insightful)

    by JohnHegarty ( 453016 ) on Wednesday December 26, 2001 @10:30AM (#2751649) Homepage
    I think the only thing that will shape the (home) computer world for the next few years is whether and when cheap broadband becomes available to the masses.
  • by Anonymous Coward on Wednesday December 26, 2001 @10:36AM (#2751665)
    The economy. This is the number one factor that will affect PCs in 2002. When the economy is shite, research firms are cut back which slows development of new technology. Manufacturers cut back which slows deployment of new technology. Consumers cut back which slows take-up of new technology.
  • by fxj ( 267709 ) on Wednesday December 26, 2001 @10:37AM (#2751668)
    For as long as I can remember, the batteries of my notebooks have all lasted ONE hour. I think that's a magic number. Obviously users don't need more than one hour, and it is not as important as a faster CPU or a brighter display. The same is valid for PDAs, or else they wouldn't sell so many iPaqs.
  • Not just CAD/CAM (Score:2, Insightful)

    by moogla ( 118134 ) on Wednesday December 26, 2001 @10:39AM (#2751671) Homepage Journal
    SMP does not require a special application to take advantage of it; only the operating system needs to support it (Windows 2000, XP Professional and Linux all do).

    It is useful if you like to do more than one thing at once. If you are like me and open up multiple instances of Netscape or IE, Word, MP3 players, all while burning a CD and hosting a Quake3 server, you would immediately experience the benefit from SMP.

    Any multithreaded app can gain the benefit of SMP (not to mention running many apps simultaneously).
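    For illustration only, here is a minimal sketch of that benefit (this is not from the article; it assumes Python's standard multiprocessing module and a made-up busy_work function). The same CPU-bound workload runs once serially and once spread across however many processors the OS exposes:

        import multiprocessing as mp
        import time

        def busy_work(n):
            # Deliberately CPU-bound loop standing in for one task (an encode job, a compile, etc.).
            total = 0
            for i in range(n):
                total += i * i
            return total

        if __name__ == "__main__":
            jobs = [2_000_000] * 8                # eight independent CPU-bound tasks

            start = time.time()
            for n in jobs:                        # everything on one CPU
                busy_work(n)
            serial = time.time() - start

            start = time.time()
            with mp.Pool() as pool:               # one worker process per CPU by default
                pool.map(busy_work, jobs)         # the OS schedules the workers across CPUs
            parallel = time.time() - start

            print("serial: %.2fs  parallel: %.2fs" % (serial, parallel))

    On a one-CPU box the two timings come out roughly equal; on an SMP box the pooled run finishes several times faster, which is the "more than one thing at once" benefit described above.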
  • by JanneM ( 7445 ) on Wednesday December 26, 2001 @10:43AM (#2751685) Homepage
    I'm an avid Linux user, but I think this is correct; Linux will not have penetrated the desktop far enough to be a major player in 2004. It will probably have made some pretty great strides by then (I figure both GNOME and KDE will be fully useable for newbies by 2003 - and no, they aren't today), but it will take longer than that (if ever) to become the dominant desktop.

    /Janne
  • by Junks Jerzey ( 54586 ) on Wednesday December 26, 2001 @10:49AM (#2751699)
    I was thinking over the holidays about how much I prefer playing games on a dedicated console instead of my PC. PCs have gotten to be necessary evils, especially in recent years. Consider:

    1. Upgrading one piece of software or one hardware component (e.g. video card) can easily turn into a cascade of upgrades and a week's worth of evenings. I've gotten afraid to upgrade; I don't want to mess with something that works.

    2. The rash of awful virii and worms that get released for whatever system provides the most opportunity (note: If Linux were on 95% of all desktops, there would be just as many Linux viruses; thinking otherwise is like thinking you have developed an unbreakable copy protection scheme). Keeping up with all the security patches and such has been a real headache. And unless I keep up with sites where these things are announced, I'd never know about them.

    3. There's still a general unreliability factor associated with PCs. Sometimes my PC doesn't boot completely, and I have to power down and try again. Ever run a game, hear the monitor click to indicate a resolution change, and then nothing happens? Even if you can kill the game, you can't get your video card to reset without a reboot. This is a common occurrence in both Linux and Windows.

    4. 99% of the time there's a problem with a game or application, the response is "Do you have the latest video card drivers?" They seem to be released stealthily every few weeks. Who wants to deal with it? And whenever you upgrade there's a high probability of trouble with older software. See #1.

    If PCs change in a drastic way, I'd like to see that change be in the reliability direction. Yes, yes, yes, Linux is more reliable than Windows 95/98/ME, but Windows 2000 and XP are right up there with Linux. The OS wars dodge the issue. If PCs could be made as reliable as cell phones or PDAs, then I might be interested in them again. Right now I simply view them as mainframes for your home, with all the same system administration headaches.
  • How about gzipped XML? Or a compression scheme specially designed to compress XML? Really, this isn't that big a problem. In fact, a gzipped XML Word file would probably be smaller than the binary file as the text would be compressed as well. Faster processors make this easier than ever.
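    As a rough, hypothetical illustration of that point (the XML fragment below is made up and says nothing about Word's real formats), gzip squeezes verbose, repetitive markup down dramatically:

        import gzip

        # A deliberately verbose, repetitive XML fragment standing in for a markup-heavy document.
        record = b"<paragraph style='Normal'><run font='Times'>Hello, world.</run></paragraph>\n"
        document = b"<?xml version='1.0'?>\n<doc>\n" + record * 1000 + b"</doc>\n"

        compressed = gzip.compress(document)

        print("plain XML:   %d bytes" % len(document))
        print("gzipped XML: %d bytes" % len(compressed))
        print("ratio:       %.1fx smaller" % (len(document) / len(compressed)))

    Repeated tag names compress extremely well, so the storage cost of an XML-based format is mostly a CPU cost for compressing and decompressing, which is the "faster processors" point.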
  • by dasheiff ( 261577 ) on Wednesday December 26, 2001 @10:50AM (#2751704)
    What is it about voice recognition that suckers journalists in every time?

    They're writing about what they see as most important. You need to remember that reporters/journalists/commentators in the print media want desperately to be in the non-print media (radio/TV).

    I was hoping you were going for the fact that print journalists have to write a lot; they often dictate into personal recorders to get a story and would rather not have to transcribe it to their computers by hand later.

  • by Wakko Warner ( 324 ) on Wednesday December 26, 2001 @10:54AM (#2751717) Homepage Journal
    Requiring copy protection to be built into every single computer peripheral capable of storage is kinda significant, yet merits no mention. Maybe nobody's supposed to know about it?

    -A.P.
  • by pyramid termite ( 458232 ) on Wednesday December 26, 2001 @11:07AM (#2751762)
    A new architecture. No, we're just going to keep using the IBM PC, with its IRQs and other funky crap that was invented in the early '80s and has to be hacked around to get today's computer working at a decent speed. Eventually, someone's going to have to take the plunge and reinvent the computer. Don't hold your breath.
  • by babbage ( 61057 ) <cdevers AT cis DOT usouthal DOT edu> on Wednesday December 26, 2001 @12:32PM (#2752057) Homepage Journal
    Efficient programming tools.

    Go back and read Fred Brooks' [unc.edu] excellent book, The Mythical Man-Month [aw.com] (original copyright 1975, 20th anniversary edition in 1995), and specifically chapter 16, "No Silver Bullet: Essence and Accident in Software Engineering". If you come across the 20th anniversary edition, also check out chapter 17, ""No Silver Bullet" Refired", and the following chapter, which discusses which of Brooks' predictions did, didn't, and are still waiting to come to pass. Chapter 16 is captioned, succinctly,

    There is no single development, in either technology or management technique, which by itself promises even one order-of-magnitude improvement within a decade in productivity, in reliability, in simplicity.

    Even though that was written decades ago, it's every bit as true now as it was then. There are no programming breakthroughs on the horizon. Four programmers will never be able to write a better Photoshop in four months, because Adobe has been throwing dozens or hundreds of very smart programmers at the problem for years now, and they've had access to the very best development tools and methodologies available.

    As one very smart and very skilled Perl hacker I know mentioned recently, he *hates* Perl and he *hates* programming, not because Perl is such a bad language (he doesn't seem to think that it is) but because even a cleverly idiomatic, high-level language like that can't do anything to make the everyday logical issues in programming go away. All it can do is, as much as possible, minimize the burden of having to juggle syntax, implementation details, and high- and low-level logical issues all at the same time.

    No software development breakthrough has been able to eliminate those problems. Not high-level languages, not object-oriented tools & methodologies, not artificial intelligence or expert systems or graphical/icon-based programming or fancy debuggers or advanced IDEs or more powerful hardware. None of it has made the essential, intractable problems go away, though most of them have made the ancillary issues less problematic. As Brooks puts it (emphasis his):

    I believe the hard part of building software to be the specification, design, and testing of this conceptual construct, not the labor of representing it and testing the fidelity of the representation. We still make syntax errors, to be sure; but they are fuzz compared to the conceptual errors in most systems.

    If this is true, building software will always be hard. There is inherently no silver bullet.

    And that about sums it up. You might as well focus on the hardware advances, because Moore's Law still has them proceeding at an incredible clip. But software? It isn't growing any faster than any other human endeavour, which is to say, it's moving slowly and it always will. It's not the software's fault that the hardware is making it look pokey, so please don't ask any more of it [in terms of methodology or technique] than the last fifty years of experience have demonstrated. Clearly, we're moving ahead as fast as we can, and that means slowly...

  • Re:Tools (Score:3, Insightful)

    by foobar104 ( 206452 ) on Wednesday December 26, 2001 @12:39PM (#2752084) Journal
    We're trappist monks, trapped by the bounds of syntax. The time for change is near.

    Meanwhile, tons of the image processing code in the application I'm currently working on is hand-coded in MIPS assembly. It's not old code; it's actively maintained stuff. I don't think anyone did that because they thought it'd be fun. I think they did it because it resulted in a better end-product.

    Use all the drag-and-drop GUI tools you want. I still believe the things that separate a good program from a bad program lie at opposite ends: the overall design, and the twiddly optimizations. A computer might be able to help with the stuff in the middle (linking objects to interfaces to objects, or whatever), but it simply can't generate those two main things for you.
  • by jurgen ( 14843 ) on Wednesday December 26, 2001 @01:04PM (#2752165)
    Here's another thing journalists (and a lot of other people) don't get: more RAM is the best way to get more out of your computer! For their "specs of your PC in 2004" they list...

    Desktop: 512MB RAM
    Laptop: 256MB RAM

    Huh? I have more than that in both today. My desktop has 1GB and my laptop 384MB.

    On the other hand, they see a 4-5 GHz CPU in the desktop and a 2-3 GHz CPU in the laptop. Who needs that? 1-2 GHz is very fast... the main reason even today's 1 GHz PCs often "feel slow" to their users is that they don't have enough RAM! I hear it all the time... "my PC is slow" (brand new PC with a 1 GHz CPU)... turns out they only have 128MB of RAM, and every time they switch between their word processor and their browser, half of one or the other gets paged out. Duh.

    I doubt that the default laptop will go much beyond a 1GHz CPU in the next few years anyway... what we need much more now in laptops (other than RAM ;-) is lower power consumption, less heat output, etc.

    And I doubt desktops will go much beyond 2GHz soon... servers, sure, some high-end workstations, sure, but a typical home/office PC? Who needs the speed? With what we have today you can process a live video stream while simultaneously playing Quake at 60fps (with help from dedicated video/3D hardware), and those are some of the most computing-resource-intensive apps anyone has come up with yet.

    :j
  • by hackstraw ( 262471 ) on Wednesday December 26, 2001 @01:16PM (#2752211)
    I will not endorse Passport for many reasons (proprietary, MS platform only, database owned by M$, etc.). But I definitely do support some kind of identification on the net based upon private/public key technology, preferably stored on a secure device (e.g., a smartcard). Can you imagine not having to remember a password ever again, or at most one password to unlock your smartcard?

    I cannot believe that in 2002 we are still securing our networks with username/passwords written on sticky notes stuck to monitors.
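    For what it's worth, here is a minimal sketch of the challenge/response idea behind that kind of login (it uses the third-party Python "cryptography" package; on a real smartcard the private key would be generated on the card and never leave it):

        import os
        from cryptography.exceptions import InvalidSignature
        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.asymmetric import rsa, padding

        pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH)

        # Enrollment: the key pair is created (ideally on the card) and the public half
        # is registered with the server. No shared password exists anywhere.
        private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
        public_key = private_key.public_key()

        # Login, server side: send a fresh random challenge instead of asking for a password.
        challenge = os.urandom(32)

        # Login, client side: the card signs the challenge after the user enters their one PIN.
        signature = private_key.sign(challenge, pss, hashes.SHA256())

        # Server side: verify the signature against the stored public key.
        try:
            public_key.verify(signature, challenge, pss, hashes.SHA256())
            print("login accepted; no reusable secret ever crossed the wire")
        except InvalidSignature:
            print("login rejected")

    The user remembers at most the card's PIN; everything else is keys, which is the point being made above.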

  • by daveking ( 110208 ) on Wednesday December 26, 2001 @01:37PM (#2752291)
    Computers are transforming into collections of separate networked modules.

    Most computer components are already available as networked modules: storage, audio, input, printing. Even displays with graphics processors are available as tablets and webpads. This trend will continue. Protocols and software will evolve to support it.

    Soon, processors will find their way to the market as a separate networked module, probably coupled with memory. When you add one of these modules to your network, distributed processing will let you use it in addition to all the others you already have.

    You and your family (and maybe even your neighbors) will share processing and storage resources as you use your own separate portable terminals.

    Your most important data will be encrypted on a storage module that looks more like a safe, set in concrete in the foundation of your house.
  • Desktops in 2004? (Score:3, Insightful)

    by iabervon ( 1971 ) on Wednesday December 26, 2001 @01:45PM (#2752314) Homepage Journal
    By then, we'll have the ability to connect a number of keyboard/mouse/monitor/removable-drive combinations to a single computer, and OSes will have enough stability and extra power to handle it. A family will buy a single fast computer and 2-3 heads for it, and then they'll never have to argue over it, because each head is really cheap. In fact, they'll probably get extra heads to have in different rooms, just because it's convenient.

    Once flat-panel displays are as cheap as CRTs, there's no reason to sit at a desk to use the computer; have something laptop-shaped, but attached to a machine in the closet. Everything that's expensive to make small stays in the machine in the closet; everything that's small by default fits on your lap.

    Then people will want to ditch the cords, and they'll be out of Bluetooth range, so the heads will turn into 802.11 network appliances; LAN appliances, not internet appliances. You'll buy a computer, and it won't have a monitor or anything; those will be in the appliance. The whole thing will only cost a bit more than having a single unit, and it will be much more convenient.

    Eventually, of course, you'll be able to do things like use your home computer from a friend's house; since everything has been designed for having an 802.11 network between the user and the CPU, having the internet in between isn't much different.

    So, in 2004, my "desktop" computer won't be on a desk, and I won't be sitting at a desk to use it.
  • by Zeinfeld ( 263942 ) on Wednesday December 26, 2001 @03:00PM (#2752567) Homepage
    The description of the PC of 2004 sounded pretty flat. A laptop with 256MB of memory that is an inch thick and costs $2000.

    A mid-range Sony Vaio can be had today with those specs for $1500, including the docking station. Admittedly the processor is 1 GHz rather than 2, but battery life is the principal reason for that. And most people who have the choice today go for smaller, lighter machines rather than huge, brick-like desktop replacements.

    What I think will happen is that the laptop phenomenon will start to merge with the PDA line. Most people don't actually need or want a laptop; they want a PDA that can read email and do PowerPoint presentations.

    Another thing to think about is that with 802.11b and the like, it is not necessarily the case that you need a powerful machine in your hand. We may well start to see the portable display tablet becoming detached from the desktop processor.

  • by markj02 ( 544487 ) on Wednesday December 26, 2001 @03:00PM (#2752568)
    [machines will be 100x as fast, but] Software that's capable of taking advantage of all this processing muscle is nowhere in sight.

    I find this fascinating. On the one hand, we have great programming languages, tools, and libraries whose only disadvantage compared to C, C++, Java, and C# is that they are maybe 10x slower. We have the processors to run them faster than we could run assembly a few years ago. Yet, whenever these new processors come out, everybody goes back, wastes lots of time tuning their C/C++ code, and then complains that all those cycles are useless. There are still endless debates, even in 2001, about whether GNOME or KDE is faster. The Linux kernel developers don't even want to move to C++.

    Folks, those cycles are very useful. Not for some obscure technology that you know nothing about. They are very useful for letting you program faster by worrying less about fine-tuning your software, and for automating lots of tasks. They are also very useful for making programs safer and more robust automatically, by eliminating common bugs like buffer overflows. And they are very useful for component-based software construction, which requires some form of runtime reflection, much better done automatically.
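    A tiny, hypothetical example of "spending cycles on safety": in a bounds-checked language, the index check that runs on every access turns an off-by-one bug into a clean exception instead of silent memory corruption:

        def copy_fixed_width(text, width=8):
            # Fixed-size destination, like a C char buffer.
            buffer = [" "] * width
            for i, ch in enumerate(text):
                buffer[i] = ch        # checked at runtime; C's strcpy would happily overrun
            return "".join(buffer)

        print(copy_fixed_width("short"))
        try:
            copy_fixed_width("far too long to fit")
        except IndexError:
            print("overflow caught at runtime instead of corrupting adjacent data")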

  • by SilentChris ( 452960 ) on Wednesday December 26, 2001 @03:46PM (#2752752) Homepage
    They sorta mention it briefly with voice portals (which, in my opinion, will die a quick dot-com death), but I think we may really see a resurgence of voice recognition within the OS itself. MS has already started building up the Control Panel and Office for voice recognition, and I think if their XBox Voice Commander is successful (think mainstream) we could really start to see a push for computers that actually interface with us naturally.

    Personally, I'm hoping for a holodeck-like experience. "Computer, give me Victorian-era England. And don't skimp out on the bustiers".

  • by Kiffer ( 206134 ) on Wednesday December 26, 2001 @03:51PM (#2752776)
    In one way I agree, and in another I'm annoyed. It said "Your PC in 2004," not "the average PC." It's fair enough for them to say that the average PC will be running MS Windows 2004 (or whatever they'll be calling it), but it seems like they thought of Linux and then thought, "even people who use it now won't be using Linux."
    There were a number of other things in that article that annoyed me... one was sound. What sort of sound cards will be around in 2004?
  • by hurst ( 221158 ) on Wednesday December 26, 2001 @06:02PM (#2753180)
    LCD Replacement ?

    Let them replace CRTs first


    Let them get colors right on LCDs before they completely replace CRTs.

    Ever try to do color work on an LCD? It sucks. Colors change depending on viewing angle, and the viewing-angle difference between the center of the screen and the edge is enough to change the color and luminance significantly. So chances are, you have your color palette on the edge of the screen... Pick a color and use it in the center of the screen, and it appears to be a different color! Arrgh!

    I do admit that if you're coding or staring at spreadsheets all day, you can't beat an LCD. In fact, I'm using one now, but when I have to use Photoshop, and especially if it's something that I have to print, I find a good CRT-equipped computer.
  • by McDutchie ( 151611 ) on Wednesday December 26, 2001 @07:09PM (#2753344) Homepage
    Eventually, someone's going to have to take the plunge and reinvent the computer. Don't hold your breath.
    The computer has been "reinvented" many times. (Can you say Macintosh? NeXT? BeBox?) It's not the lack of innovation, but the sheep-herd mentality of the consumers, that causes this mess to continue.
  • by McDutchie ( 151611 ) on Wednesday December 26, 2001 @07:30PM (#2753394) Homepage
    In other words, faster processors are useful to increase bloat with impunity. Exactly how does this benefit users, hmm?

    Proposal: to make a really high-quality word processor, say (as opposed to M$ Word bloatware that thinks it knows what you want but doesn't), all the programmers should be limited to 486s, which are in themselves more than powerful enough for the task. And that would be generous. Performance should be snappy on those, and the software should have a modern feature set. The programmers would be forced to leave out unnecessary bloat and program efficiently. The effect on the overall quality, even on fast machines, would be astounding.

    Using processor speed, component architectures, etc. as an excuse for messy and bloated programming is degrading programming as a whole. Unix had it right: one program for one function, and that one program should do the task well.

  • by Anonymous Coward on Wednesday December 26, 2001 @09:30PM (#2753675)
    Larger hard drives.

    Moving from 80GB to 400GB is not that big a deal, but it does let us use our desktop computers to store and trade TV shows. That's going to be fun. Anyone have the first Tick episode?

    Although this probably means that they can make a 5 GB microdrive, which would be really cool if it were hidden inside my iPaq without making the outside bigger.

    The 1-GHz palmtop

    I would be much happier with larger memories in these devices. There's not much I want to do on a palmtop that needs 1 GHz of processing power. Although it would be fun to have the device do all the moon-shot calculations and plot all the lunar orbits for all the Apollo missions in just a few seconds.

    Organic-light-emitting diodes

    Years away.

    Next-generation instant messaging

    Other countries have this infrastructure developed years ahead of where we are.

    802.11 networks

    Slow, insecure, old news.

    Markup languages for everything

    This is old news. XML won, but didn't change much.

    Hyper-threading

    Funny, I already do this on the apps I write for Linux.

    3G input/output bus

    I'll believe it when I see it. Years away.

    Peer-to-peer networking

    Lots of scalability issues to fix first.

    clear computers

    Cosmetic change only, don't see it changing anything.

    Magnetic RAM

    The old shall become new again. This is just mainframe core memory, revamped. It should be great to have 1 GB of core memory instead of flash RAM in my iPaq, though, especially since this memory is likely to outlast flash RAM by years. Good for cameras and music players too.

    Presence technology

    Already works with IM.

    Fuel cells

    Gee, do you think the battery companies want you to spend only $2.00 on a gallon of alcohol that will run every current battery-powered device you have for a week? No, they want you to shell out $5.00 for their batteries, which will only last a couple of days at most. We will be lucky to see this technology in our lifetimes.

    Distributed computing

    We already have this in the Linux world; Beowulf and MOSIX are two of the most popular, but there are more.

    Voice portals

    Years away.

    The electronic wallet

    I think I'll keep on paying with cash or card for a while yet. Call me old-fashioned.

    The new cell-phone network

    It would be nice to have 2 Mbit-per-second wireless networking that was city-wide... that beats the hell out of 802.11b.

    Extreme ultraviolet lithography

    Years away.

    Multiplicity of megapixels

    Old news.

    Serial ATA storage

    It will have to beat firewire.

    Your desktop PC specs in 2004

    CPU and RAM: 4- to 5-GHz microprocessor with 512MB of DDR memory and a 600-MHz system bus

    You would be better off with an AMD 2GHz processor and 2GB of DDR RAM. And I will be running at least 2 processors on my desktop from here on out.

    Hard disk: From 300GB to 400GB on a Serial ATA bus

    An external firewire 2 drive is faster and more compatible.

    Removable storage: Rewritable DVD and -- yes -- the unsinkable 1.44MB floppy

    Floppy is Dead.

    OS on my computer will be Linux.

    Price: $1,500 to $2,000

    I won't pay more than $600 for a new computer.

    Your notebook PC specs in 2004

    CPU and RAM: 2- to 3-GHz chip with 256MB of RAM

    Better off with a battery conserving 1.5 GHz processor with 1GB of DDR RAM.

    Hard disk: 60GB to 80GB with Serial ATA interface

    Naw, these drives will be 200GB, at least.

    No need for removable drives or even CDROM drives, I want a small laptop computer. If I want to watch videos on the laptop, I'll stream it from another machine, wirelessly.

    OS, again, Linux will be my choice, although I'll have to probably pay for windows anyway when I buy a laptop. Damn that Windows tax.

    Price: $2,000 and up

    I'll not pay more than $1200 for a laptop.
  • by Erbo ( 384 ) <amygalert@gmail.COFFEEcom minus caffeine> on Thursday December 27, 2001 @03:37AM (#2754251) Homepage Journal
    In the breezy style of the CNN article...

    Big Brother Inside: The SSSCA and Digital Rights Management

    What is it? A new mandate being legislated as we speak, pushed by the record companies and movie companies (disclosure: CNN is owned by AOL Time Warner, which is also a record company and movie company, which is why they didn't say anything about this) to keep users from copying copyrighted material without "permission."
    What's cool? Depends on whether you work for a movie company or record company--if you don't, there's very little "cool" about this. The Security Systems Standards and Certification Act (to be introduced by Senator Hollings, D-SC) will mandate that all digital devices contain copyright protection systems to keep people from copying "copyrighted material." What this means is unknown as of yet, but it's certain that the days of Napster and Gnutella will be long gone if this comes to pass... and perhaps the days of Linux as well, since it would be impossible to put secure copyright protections into an open-source operating system. The bill also mandates penalties for tampering with digital rights management systems, and for connecting an unprotected digital device to any computer network. If you want to enjoy music or movies on your computer, the movie and record companies will tell you "It's my way or the highway"--and you'll probably have to pay. And pay. And pay. And pay. And pay.
    When's it coming? The SSSCA will likely be on Congressional committee agendas early next year. Expect its sponsors (mostly Disney) to try and get it rammed through Congress as fast as they can, with as little review as they can. Then, the "industry" has a certain amount of time to come up with the copyright protection standards that will be mandatory from then on...and if they can't come to an agreement, the government will do it for them.
    What's the catch? This will basically be The End Of The World As We Know It for the computer industry. The only beneficiaries of a law like this will be the record, movie, and other "intellectual property" companies, who will expect to see more cash flowing into their already-bloated coffers. Meanwhile, a lot of people are going to get harassed for the crime of using computer systems of their choice...and the average consumer, as always, will get screwed. Repeatedly. Forever. On the other hand, it may still be possible to stop this from happening...write your Congressional representatives and tell them why this law would be a Bad Thing for the consumer, for the computer industry, and for the American economy as a whole. Of course, bear in mind that the record companies and movie companies have more money than you do, and so they're likely to get listened to first.
    Impact Meter: 10...no, make that 10,000,000.

    This is just a poor and feeble first draft...anybody else out there, feel free to rewrite it.

    Eric
