Windows Longhorn to Make Graphics Cards More Important

Renegade334 writes "The Inquirer has a story about MS Longhorn and its need for better-than-entry-level graphics cards. This is due to the WGF (Windows Graphics Foundation), which will merge 2D and 3D graphics operations into one pipeline, and to 3D menus and interfaces that require at least Shader 2.0-compliant cards. Supposedly this will really affect the performance of the new Microsoft OS." This has been noted before in the system requirements for Longhorn, but it would seem the full impact is slowly being realized.
  • How silly (Score:5, Interesting)

    by grub ( 11606 ) <slashdot@grub.net> on Thursday January 13, 2005 @10:19PM (#11355546) Homepage Journal

    This is due to the WGF (Windows Graphics Foundation), which will merge 2D and 3D graphics operations into one pipeline, and to 3D menus and interfaces that require at least Shader 2.0-compliant cards.

    That's just plain stupid. Grandpa & Grandma want to check their email and look at pics of the grandkids; why on earth should they need a Radeon MegaXP293823-XtremeSLI+ to do that? I hope there's an option to disable all that cycle-wasting crud, or MS may be shooting itself in the foot: how many offices will spend a few hundred dollars per seat on video cards just to upgrade the OS? And what about machines with onboard video (à la Dell)?
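    "Shader 2.0 compliant" is something software can probe for up front, which is exactly what an option to disable the eye candy would hinge on. A minimal sketch of such a capability check against the Direct3D 9 API (assuming the DirectX 9 SDK headers and d3d9.lib; the printed messages are made up for illustration):

        // sm2_check.cpp -- probe the primary display adapter for
        // Shader Model 2.0 support via the Direct3D 9 device caps.
        // Build: cl sm2_check.cpp d3d9.lib
        #include <windows.h>
        #include <d3d9.h>
        #include <cstdio>

        int main()
        {
            // Create the Direct3D 9 interface object.
            IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
            if (!d3d) {
                std::printf("Direct3D 9 runtime not available\n");
                return 1;
            }

            // Ask the primary adapter's hardware (HAL) device for its caps.
            D3DCAPS9 caps;
            if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT,
                                          D3DDEVTYPE_HAL, &caps))) {
                std::printf("couldn't query device caps\n");
                d3d->Release();
                return 1;
            }

            // PixelShaderVersion packs major.minor; compare against 2.0.
            if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
                std::printf("Shader 2.0-class card: full eye candy\n");
            else
                std::printf("older card: fall back to a classic UI\n");

            d3d->Release();
            return 0;
        }

    An installer, or the OS itself, could run exactly this kind of check once and default older machines to a non-composited interface.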
  • Funny (Score:1, Interesting)

    by spac3manspiff ( 839454 ) <spac3manspiff@gmail.com> on Thursday January 13, 2005 @10:20PM (#11355555) Journal
    "Windows Longhorn to make graphics card important"
    Translation: Microsoft inserts knife in back.

    People are not going to buy Longhorn if they need to buy a graphics card too. They will look for something else and find Linux!
  • Cool (Score:2, Interesting)

    by the_Bionic_lemming ( 446569 ) on Thursday January 13, 2005 @10:21PM (#11355567)
    Finally, a move toward using hardware to speed stuff up.

    I know we'll see a bunch of folks protesting bloat and other FUD, but it'll be cool to see what they come up with for a home UI that strains a video card.
  • Great, but. (Score:5, Interesting)

    by PenchantToLurk ( 694161 ) on Thursday January 13, 2005 @10:27PM (#11355629)
    I've used Windows since 3.0. I'm a Windows (.NET) developer. And I agree that the gee-whiz factor will be great. Animations, depth to menus... it'll be gorgeous.

    But... no matter how fast computers get, the Windows Explorer shell always seems to become less snappy, even on fresh installs. XP made the Start menu slower than ever by retrieving nonessential metadata for its shortcuts. Myriad shell extensions, over time, bring the Explorer UI to a crawl.

    Sexy is great, but I have to use this thing every day. It's just not worth making the UI even more sluggish.
  • Prices (Score:3, Interesting)

    by Sophrosyne ( 630428 ) on Thursday January 13, 2005 @10:30PM (#11355665) Homepage
    Watch those PC prices go up for a little bit... then potentially drop. ATI and Nvidia would be smart to cash in on this: maybe bundle Longhorn with video cards and extra RAM.
  • by Knight2K ( 102749 ) on Thursday January 13, 2005 @10:33PM (#11355704) Homepage
    My first thought was: "Gee, how original! Haven't heard of a good idea like that since... Mac OS X, maybe."

    I'd be surprised if they really went wild with 3D interfaces like the 'Jurassic Park' file browser, or the cube with web pages mapped onto it that was posted here a while ago. I think they are just going to do what Apple has already done and what Keith Packard is working on for X.

    You are probably right. Microsoft will only use it for flashy effects. At least Apple eventually got to arguably useful things like Exposé. But they had to put Quartz Extreme in place first before they could do it.

    I'm not inclined to be charitable, but hopefully this is Microsoft laying the groundwork for interesting and useful user interface ideas.

    Naaaah, that is too nice.
  • by Dominic_Mazzoni ( 125164 ) on Thursday January 13, 2005 @10:34PM (#11355707) Homepage
    Is this going to be another case where Microsoft tries to copy Apple but misses the point?

    Mac OS X 10.2 introduced "Quartz Extreme", which uses the graphics card to composite the screen. This meant that dragging windows around required almost no CPU power at all. In 10.3, Apple introduced several 3D effects to enhance the interface, most notably the rotating cube when you switch users.

    There are two key points that Microsoft seems to be missing, though:

    * Mac OS X looks exactly the same if you don't have a powerful enough graphics card, and screen redrawing is not too slow. Having a graphics card just makes the system more responsive because the CPU is doing less of the work.

    * The system degrades gracefully - if you don't have a powerful enough graphics card or run out of video RAM, certain 3D transitions may be skipped. But everything will still function, and everything will look the same.

    It's too early to tell, but it is starting to sound like Microsoft may be creating a new interface that requires a powerful graphics card, leaving those with cheap integrated video a completely different interface. To me that sounds like a recipe for tech-support hell: novice users won't understand why their screen doesn't look like someone else's.
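    The graceful degradation described above boils down to picking a compositing backend once, behind an interface that draws the same pixels either way. A minimal sketch of that structure (the types and the probe are hypothetical stand-ins; a real probe would be a caps query like Direct3D's GetDeviceCaps, or an OpenGL extension-string check):

        // compositor_choice.cpp -- sketch of graceful degradation:
        // one startup probe picks the backend; the desktop looks the
        // same either way, only the responsiveness differs.
        #include <cstdio>

        struct Compositor {
            virtual ~Compositor() {}
            virtual void compose() = 0;  // must produce identical visuals
        };

        struct GpuCompositor : Compositor {
            void compose() {
                std::printf("composite windows as textured quads on the GPU\n");
            }
        };

        struct SoftwareCompositor : Compositor {
            void compose() {
                std::printf("composite windows with CPU blits\n");
            }
        };

        // Hypothetical probe: stands in for a capability/VRAM check.
        bool gpu_compositing_usable()
        {
            return false;  // e.g. no Shader 2.0, or too little video RAM
        }

        int main()
        {
            // Decide once at startup; everything above this layer is
            // unchanged, so novices never see a "different" screen.
            Compositor *c;
            if (gpu_compositing_usable())
                c = new GpuCompositor;
            else
                c = new SoftwareCompositor;
            c->compose();
            delete c;
            return 0;
        }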
  • Re:How silly (Score:5, Interesting)

    by Tasy ( 410890 ) on Thursday January 13, 2005 @10:34PM (#11355712)
    I think something most people don't realize is that by using the GPU to render, you are actually taking load OFF the CPU, not adding to it. Bravo to Microsoft for this.

    Now all we have to do is pray they don't leave some loophole open that lets someone burn out your video card. Can you imagine built-in Windows overclocking?

    *shudder*
  • call me crazy... (Score:2, Interesting)

    by dAzED1 ( 33635 ) on Thursday January 13, 2005 @10:35PM (#11355716) Journal
    but who says grandpa and grandma need to move to Longhorn as soon as it comes out, when MS is just now ending support for Windows NT 4.0, as reported recently here on /.?

    Grandpa and grandma will be just fine on 2000 or XP, or... and here's the crazy part... even 98. My father-in-law still uses Win 3.freaking-1 on a 486, for Christ's sake. Grandpa and grandma will be just fine.

  • by KalvinB ( 205500 ) on Thursday January 13, 2005 @10:41PM (#11355765) Homepage
    There's been a slump in the computer sector since the massive rollout around 2000; not many people buy a new computer within a couple of years of the last one. It wouldn't surprise me if most people were still using the systems they bought four years ago. If they're running XP, it was a software-only upgrade.

    When XP came out, my dad, a programmer for a large corporation, didn't rush out; he eventually bought a new computer from Dell with XP on it about a year ago. His previous system was a 350 MHz Dell. A programmer myself, my top system is a 1.2 GHz Duron running Win2K, and I've had it for a couple of years.

    When Longhorn comes out it will be time for an upgrade anyway, and most people are going to buy prebuilt systems. Those prebuilt systems will have a (barely) sufficient graphics card.

    GeForce FX 5500s are well under $100 already. In a couple of years, when Windows needs that kind of card to run, they'll be dirt cheap and onboard.

    And it'll be just in time for when people are looking to upgrade their computer hardware anyway.

    Complaining that MS is forcing upgrades is as silly as claiming id Software forces hardware upgrades. I still use 2000, and could use 98 if I wanted. I could also play Wolfenstein 3D and stick to a 386. Something needs to drive the market: if there were no need for better hardware, there'd be no better hardware. It's all artificially driven anyway. There's no objective reason we need fancy-pants graphics in any software, and no objective reason we need high-quality, drive-space/CPU/memory-eating audio and video.

    In short, who cares that MS is making greater graphics demands for its OS? They've done this with every release. Even Linux is making greater and greater demands. If you only want the graphics pizzazz of Windows 3.11, use Windows 3.11. Some of us like an OS that looks "pretty."

    If you want a plain-text OS, then use DOS, or ditch the GUI on Linux and have fun.

  • Well... (Score:5, Interesting)

    by i0wnzj005uck4 ( 603384 ) on Thursday January 13, 2005 @10:46PM (#11355816) Homepage
    I'd say that 3D acceleration is a Good Thing. After using Quartz Extreme on multiple Macs, I have to say it makes a massive difference in most apps. It *does* speed up even moderately easy 2D things, like word processors. Where you notice the biggest difference, though, is when switching between programs: the window images are already loaded in video RAM, so a lot of it is instantaneous. And yeah, iChat AV wouldn't be quite as pretty on Win XP.

    But the real question is: why are pixel shaders needed? Unless you're doing strange reflections, simulating bumps, or playing around with reflectivity in real time, I can't imagine a use for them. I certainly can't see why you'd need anything more than simple textured quads or triangles, plus some sort of alpha support for shadows. All of that sounds like a TNT2-era card, like the one I used to play Quake II on.

    What this really feels like is Microsoft pushing hardware adoption again. Ever notice how new motherboards don't come with USB 2.0 drivers for the original Windows XP release, and you have to upgrade to the latest service pack to get that support? Partly piracy curbing, and partly, I think, a way to keep a hold on the market by steering people toward approved hardware.
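    The "textured quads plus alpha" point above is easy to make concrete: cache a window's contents as a texture, then composite it as a single alpha-blended quad. Everything in the sketch below is fixed-function OpenGL of the sort TNT2-class hardware handled (GLUT provides the window; the 2x2 checkerboard is a stand-in for a window's cached image):

        // quad_window.cpp -- compositing a "window" as one alpha-blended
        // textured quad: no pixel shaders anywhere in sight.
        // Build (Linux): g++ quad_window.cpp -lglut -lGL
        #include <GL/glut.h>

        GLuint tex;

        void display()
        {
            glClear(GL_COLOR_BUFFER_BIT);

            // Alpha blending covers translucency and drop shadows.
            glEnable(GL_BLEND);
            glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

            glEnable(GL_TEXTURE_2D);
            glBindTexture(GL_TEXTURE_2D, tex);

            // One quad = one composited window.
            glBegin(GL_QUADS);
            glTexCoord2f(0, 0); glVertex2f(-0.5f, -0.5f);
            glTexCoord2f(1, 0); glVertex2f( 0.5f, -0.5f);
            glTexCoord2f(1, 1); glVertex2f( 0.5f,  0.5f);
            glTexCoord2f(0, 1); glVertex2f(-0.5f,  0.5f);
            glEnd();

            glutSwapBuffers();
        }

        int main(int argc, char **argv)
        {
            glutInit(&argc, argv);
            glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);
            glutCreateWindow("window-as-texture");
            glClearColor(0.2f, 0.4f, 0.6f, 1.0f);  // the "desktop" behind it

            // Stand-in for cached window contents: a 2x2 RGBA checkerboard
            // with two half-transparent texels.
            unsigned char px[] = { 255,255,255,255,   0,0,0,128,
                                     0,0,0,128,     255,255,255,255 };
            glGenTextures(1, &tex);
            glBindTexture(GL_TEXTURE_2D, tex);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 2, 2, 0,
                         GL_RGBA, GL_UNSIGNED_BYTE, px);

            glutDisplayFunc(display);
            glutMainLoop();
            return 0;
        }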
  • by Anonymous Coward on Thursday January 13, 2005 @10:50PM (#11355847)
    The funny thing is I'm using KDE now, with an OpenGL window decoration (crystal-gl, IIRC). It's subtle: it just makes the title bar look like it's made of tinted, curved glass, refracting the desktop background. And on my crappy Athlon 750 with a Radeon 9000, it runs just as fast as any of the normal window decorations, because my otherwise-unused video card is doing all the heavy lifting.
  • Re:Funny (Score:4, Interesting)

    by tomstdenis ( 446163 ) <tomstdenis@gmail.com> on Thursday January 13, 2005 @10:53PM (#11355868) Homepage
    No, the argument is that the wording is misleading.

    Just as /. types bitch that "MPAA cracking down on torrents" makes BitTorrent look bad, saying "hardware X doesn't work in Linux" [when comparing to Windows] makes Linux sound bad.

    The reason most wifi hardware doesn't work in Linux isn't a lack of trying. It's that hardware manufacturers GO OUT OF THEIR WAY not to support Linux.

    For example, my friend got a "v4 Linksys" 802.11b card [IIRC it was Linksys...] and found out that only the v3 card works in Linux.

    Similarly, the "SoundMAX" cmpci Asus chipset [at least when first introduced] was purposely made different from the original cmpci chipset [and didn't work, at least with the 2.4 kernels].

    So it's not that Linux developers don't write drivers [or try to]; it's that hardware developers change specs and don't document things.

    In the future, just say "Linksys doesn't support their customers [*]" instead of "Linux doesn't support Linksys".

    [*] Any BS about there not being enough Linux users is just stupid. The benefit of taking the time to write competent Linux drivers [or just release the specs] would far outweigh the cost.

    Tom
  • by fyngyrz ( 762201 ) on Thursday January 13, 2005 @11:07PM (#11355974) Homepage Journal
    Making use of the available graphics power just makes sense, and Apple was smart to be the first to realize this.

    Mmm, no. Commodore was the first to really do this. The original Amiga had native graphics capabilities that still aren't available in PC hardware (like multiple resolutions onscreen). The OS used them, and used them well. When a more advanced Amiga came along with more graphics capabilities, the OS automatically configured and used those as well. Apple was "me too", much later. :)

    But that's OK. Apple knows how to market, and that more than makes up for arriving expensive, late, and/or weak with a number of things. Plus they provide a really nice end-user experience.

  • by atlasheavy ( 169115 ) on Thursday January 13, 2005 @11:20PM (#11356075) Homepage
    Yes, XP ran a hell of a lot better than Panther. The systems in comparison, by the way, were a 400 MHz Celeron with 128 MB RAM and a shitty 8 MB ATI video card, and an iBook G3 500 with 128 MB RAM and whatever shitty video card was in that thing (I think it was actually a 16 MB ATI). And specifically, the problems I ran into were directly tied to Quartz rendering.
  • Re:How silly (Score:3, Interesting)

    by nuggetman ( 242645 ) on Friday January 14, 2005 @12:00AM (#11356686) Homepage
    Having any themes on in Windows XP on my Athlon XP 1900+, 512 MB RAM, and a Radeon AIW 7500 nearly kills it just scrolling down the Start menu (considering ALL sound and all other windows freeze, I'd say that's close enough to killing it).


    You, my friend, have some other problem with your system. Or you're flat-out trolling. I use themes on XP on a 667 MHz P3 with 384 MB of RAM with absolutely no trouble.
  • by yorkpaddy ( 830859 ) on Friday January 14, 2005 @12:24AM (#11357026)
    This is the broken window fallacy. The OS requiring a 3D graphics card will cause people to buy more 3D graphics cards and more expensive computers, and you say, "Aha, more money being spent; that's good for the economy." Not necessarily. The money on 3D graphics cards has to be spent just to get what your computer already did well without one (draw a GUI). Unless the new UI adds a lot to the experience, there is no net gain; we have just spent money to get back to where we originally were (a "usable" GUI).

    Wikipedia: Broken Window Fallacy [wikipedia.org]
  • by pipingguy ( 566974 ) on Friday January 14, 2005 @01:10AM (#11357620)

    Hey, I *like* drop shadows and semi-transparency on menus and the like; they provide a "rich" environment and also help to prioritize open windows. Perhaps you are a command-line guru; I work with CAD software a lot, and I appreciate the eye candy as a visual indicator. Then again, if it were up to me we'd toss all the CAD software and hardware and go back to board drafting: less of the "it's easy to revise because it's on the computer, so let's do it a lot" attitude, and more forethought required when designing.

    "Keeping up to speed" these days has more to do with updating one's computer knowledge quotient and not enough to do with actually doing real-world stuff and improving skills in the disciplines that we use computers to help us with in the first place.
  • by Hamsterdan ( 815291 ) on Friday January 14, 2005 @01:28AM (#11357833)
    Agreed. I've always wondered how a WinNT server would compare to a *NIX box if we were allowed to boot it CLI-only. I mean, running a GUI on a *server* is pretty pointless. My file server doesn't even have a monitor (it's on NT, so kill me :p).
  • by bombshelter13 ( 786671 ) on Friday January 14, 2005 @02:58AM (#11358613)
    It could be, just maybe, that Microsoft is doing something that, in some twisted, demented way, makes sense as a move to strengthen its hold on its primary market: the average, barely computer-literate home user.

    Think about it. Longhorn comes out, and all the typical, non-geek home users run out to buy a new computer so they can use it. Why? Because, as has been established so often, the average user is ~used to~ buying a newer computer so they can run the newest version of Windows. So now they have Longhorn, and good god is it pretty. Unnecessarily pretty, yes. Inefficiently pretty, since it takes all kinds of resources to keep running. But the average user is unaware those resources even exist, so what does he see? Damn, that's pretty... those are some sweet transparency effects, and don't you love all the neat little animations?

    Now, what happens when this same user sees someone running a Linux desktop, even with the prettiest set of KDE themes and widgets you can find? I'll tell you what he thinks: "Hey, that doesn't look nearly as nice as my Windows box... it barely even looks better than that old version of Windows (i.e., Windows XP)," and he immediately dismisses Linux as obsolete and "old-fashioned," because "look how much prettier Longhorn is; it must be more advanced." A bit of twisted thinking from Microsoft, but if you look at it like that, you have to admit it kinda makes sense.
  • by Lisandro ( 799651 ) on Friday January 14, 2005 @03:29AM (#11358823)
    IIRC, what differentiated the Amigas was that you could not only mix multiple resolutions onscreen, but multiple resolutions with different bit depths, palettes, and even mouse cursors, all drawn by hardware. This is off the top of my head, as I (sadly) never owned an Amiga and only fiddled with ones friends owned, but I recall reading about it and being hugely impressed. It was truly a machine ahead of its time.

    Anyone, feel free to correct me.
  • by Anonymous Coward on Friday January 14, 2005 @06:10AM (#11359635)
    Actually, the Amiga used its graphics hardware for all sorts of co-processing. Disk drive access, I believe, was helped along by the blitter somehow, and general bit-level operations on data were done with it too. The OTHER graphics co-processor, the Copper, was also used for some non-graphics things, if I recall correctly.

    Basically, the OS treated its graphics processors as a computing resource, rather than just something to do graphics with, as I've been advocating we do on Linux for a while now. A number of projects to use modern GPUs and/or DSPs for (relatively) general-purpose processing exist now, but we need to do more, and use them for anything the OS *can* use them for.
  • Re:How silly (Score:3, Interesting)

    by barc0001 ( 173002 ) on Friday January 14, 2005 @02:31PM (#11364791)
    Oh yes. A $499 computer with no monitor. And the following spec:

    40 GB HDD (4200 RPM, no less!)
    256 MB RAM
    1.25 GHz processor
    Optical drive isn't a burner

    Spending $480 over at Dell (even though I don't like them much either) gets you:

    80 GB HDD
    512 MB RAM
    Celeron 2.6
    CD-RW drive
    17" monitor!

    So the processor might be a bit pokier than the G4, but you get twice the storage, twice the memory, a burner, AND a display. And it's still $19 less than the Apple offering. So tell me again how this is competitive?

    Just because something is stereotypical doesn't mean it's incorrect. That's how stereotypes evolve.
