
Windows Longhorn to make Graphics Cards more Important

Renegade334 writes "The Inquirer has a story about MS Longhorn and its need for better-than-entry-level graphics cards. This is due to the WGF (Windows Graphics Foundation), which will merge 2D and 3D graphics operations into one, and to 3D menus and interfaces that require at least Shader 2.0-compliant cards. Supposedly it will really affect the performance of the new Microsoft OS." This has been noted before in the system requirements for Longhorn, but it would seem the full impact is slowly being realized.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • not so much impact (Score:5, Informative)

    by diegocgteleline.es (653730) on Thursday January 13, 2005 @10:22PM (#11355585)
    IIRC, the Longhorn installer will check your graphics card (if it's lower than X fps then...) and will enable or disable the 3D functions depending on whether you have a good or bad graphics card.

    In short: the "3D mode" won't be the only one available. There will be a much lighter desktop as well (somewhat like the current XP look; you'll miss all the 3D stuff, but...)
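    The gating the parent describes amounts to a simple threshold check at install time. A hypothetical sketch of that logic (the threshold value and mode names are invented for illustration, not Microsoft's actual check):

    ```python
    # Hypothetical installer-style check: benchmark the card, then pick
    # the desktop mode. The 30 fps threshold and the mode names "3d" and
    # "classic" are invented for this sketch.
    def pick_desktop_mode(measured_fps, min_fps=30):
        """Return '3d' if the card clears the bar, else a lighter mode."""
        return "3d" if measured_fps >= min_fps else "classic"
    ```

    For example, a card benchmarked at 60 fps would get the full desktop, while one managing only 12 fps would be dropped to the lighter mode.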
  • Re:3D Interfaces? (Score:5, Informative)

    by akac (571059) on Thursday January 13, 2005 @10:32PM (#11355696) Homepage
    No, not 3D interfaces in the way you're thinking. Think of it this way: every window is now a DirectX object. No need for redrawing by an app. Since every window is now a 3D object (one with only one pixel of depth), you can do simple things like moving all the maintenance of a window's DC from the app itself to the OS.

    That's what Quartz Extreme does on OS X. This is just Quartz Extreme on PC.
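    The parent's "every window is a DirectX object" idea can be modeled in miniature: each window owns its contents, and a compositor rebuilds the screen from them every frame, so apps never repaint on expose. A toy sketch (all names are invented; single color codes stand in for window textures):

    ```python
    # Toy model of a compositing window system: each window's contents
    # live off-screen (here, just a fill color); the compositor paints
    # the framebuffer bottom-to-top each frame without asking apps to
    # redraw anything.
    class Window:
        def __init__(self, x, y, w, h, color):
            self.x, self.y, self.w, self.h, self.color = x, y, w, h, color

    def composite(windows, screen_w, screen_h, background=" "):
        """Paint windows bottom-to-top into a 2D framebuffer of colors."""
        fb = [[background] * screen_w for _ in range(screen_h)]
        for win in windows:  # later windows in the list are on top
            for row in range(win.y, min(win.y + win.h, screen_h)):
                for col in range(win.x, min(win.x + win.w, screen_w)):
                    fb[row][col] = win.color
        return fb
    ```

    Overlap resolution falls out of the paint order: where two windows intersect, the topmost one's pixels win, and uncovering a window needs no repaint message to the app underneath.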
  • Not just eye candy (Score:4, Informative)

    by miyako (632510) <miyako@NOsPAM.gmail.com> on Thursday January 13, 2005 @10:44PM (#11355789) Homepage Journal
    I'm anticipating that a lot of people are going to bitch and moan about how it's pointless eye candy, but if Microsoft is able to do what Apple has been doing, then it could really add to the UI.
    Things like Exposé and translucent windows can come in amazingly handy in OS X. I've never found anything quite as useful as transparent terminal windows, which let me have code open in one window and documentation in the window behind it, and look through the code window to read the documentation, especially when working with an API you're not familiar with.
    I think that as 3D-accelerated UIs become more common, we'll see even more useful features popping up. There's no good reason for a new computer to have a video card that won't run this, and the type of person who would upgrade to Longhorn probably already has a newer video card anyway.
    I just wish this would make it into X, but alas, I suspect it's the sort of thing that might take a while to get properly implemented and supported.
  • by Thu25245 (801369) on Thursday January 13, 2005 @10:56PM (#11355897)
    In the PC market, the real "entry-level" machines have "Integrated Intel Extreme(TM)" graphics on the mobo, which is a polite way of saying "no graphics card at all." So, yes, a Radeon 9200 is entry level for a graphics card, but it's a nice step up from what you get standard on the cheapest machines.

    If Microsoft is complaining about the performance of graphics hardware on low-end PCs, it's a solid bet that the integrated graphics cards will be the first target.
  • by mattyrobinson69 (751521) on Thursday January 13, 2005 @10:58PM (#11355916)
    No, it would be quicker, as the graphics card would be doing the window drawing and whatnot, freeing up the processor for whatever you're running in your xterms... I mean, cmd.exe windows.
  • Re:Great, but. (Score:5, Informative)

    by bogie (31020) on Thursday January 13, 2005 @11:20PM (#11356071) Journal
    That is the $64 question, isn't it? Can Microsoft learn to make an OS that doesn't slow down massively over time? I just did a fresh install on my one machine that runs XP and it's night and day. Over time XP just gets slower and slower. Of course the battle cry for MS defenders is "it's the fault of 3rd party drivers and apps". Well, then make a freaking OS that doesn't let "3rd party" apps run it into the ground. Why do I even need to use an app's uninstaller? Why, by default, doesn't XP know exactly how to remove every last bit of registry crap that got shoved in there in the first place? How come it takes 10 minutes for the start menu to come up after I've been using the OS for a while? How come many Explorer operations still lock up the OS and stop whatever work you're doing cold? When will MS make an OS that you can actually multitask on no matter what's going on in the background? MS has a lot of work to do, and somehow I get the feeling that they haven't learned their lessons yet.
  • by figleaf (672550) on Thursday January 13, 2005 @11:21PM (#11356096) Homepage
    First of all, the source of the article is the Inquirer, which is known for deliberately twisting news. It is therefore not a credible source of information.

    Second, if you have closely followed Microsoft's previous statements at WinHEC and in MSDN articles, you would know that Longhorn will provide XP-style rendering on older graphics cards. Systems with newer graphics chips will have fully 3D-accelerated graphics, thereby taking the rendering work away from the CPU and improving performance.

  • Re:Lobby (Score:3, Informative)

    by FuzzieNorn (203503) <fuzzie@@@warpedgames...com> on Thursday January 13, 2005 @11:26PM (#11356143) Homepage
    A Radeon 9200 doesn't support PS2.0.
  • Re:No biggie. (Score:2, Informative)

    by loyukfai (837795) on Thursday January 13, 2005 @11:43PM (#11356412)

    Supposing Longhorn is released in 2006, it's very likely that a lot of boxes will still have no PS 2.0 support. Remember, a lot of boxes sold in recent years have integrated graphics, and most integrated graphics don't support PS 2.0.

    Of course, by MS tradition, it will probably fall back to classic mode. But many people don't know how to configure that, so it would be better if the installer configured it automatically, or at least made it easier.

    Also, the demos will probably be run on some very capable hardware to show off these nifties. Upgraders could be disappointed when they find out they need to upgrade their graphics cards after buying their copies. Not so good for the corporate image, I guess.

    So it probably won't be a big issue, but it's not a non-issue either.

    Frankly, I can't wait to see this. All that GPU power of my 9800 is basically being wasted 99.99999999% of the time right now.

    I think, maybe you shouldn't have gotten a 9800 then...? @_@

    BTW, for those who want/need to know which chips support PS 2.0, try this [bitmanagement.com] and this [beyond3d.com].

  • Re:Is this necessary (Score:3, Informative)

    by PyroMosh (287149) on Thursday January 13, 2005 @11:50PM (#11356541) Homepage
    If you have PowerPoint installed, check this out [microsoft.com]. It's a fairly in-depth discussion of Longhorn, with emphasis on the new Windows Graphics Foundation.

    If not, I'll summarize. Or you can Google for essentially the same info, but this PowerPoint file is well done.

    One of the goals of Longhorn is to further the requirements for signed drivers, and to offload the complexity of drivers into the new WGF. The idea is that it's better to have MS write the code well once than to have lots of third-party vendors writing the same code over and over again, some better than others.

    This means:
    - Less complex (and therefore more stable) drivers
    - Only signed drivers will run (I'm skeptical that they'll keep this requirement)
    - Less processor overhead
    - Most drivers will run in user-mode pretty much all of the time, which further means that:
    - A crash will only take down the current process.
    - Better performance, since there will be one less layer between software and the hardware it's running on.
    - Crashes will be much rarer, and when they do happen, will (if executed correctly) be transparent to the user. The system will recover from a crash, and many times the user will not even be aware that an error occurred.
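    The crash-recovery claim in that last point can be sketched abstractly: if the driver is just a restartable user-mode component, the OS can catch a fault, restart the driver, and retry the operation, so the caller never notices. A toy illustration (all names invented; a real driver reset is far more involved):

    ```python
    # Toy model of user-mode driver recovery. FlakyDriver simulates a
    # driver that crashes exactly once; call_with_recovery hides the
    # crash from the caller by restarting the driver and retrying.
    _fault = {"injected": False}

    class FlakyDriver:
        """Stand-in for a user-mode driver that faults on its first call."""
        def blit(self):
            if not _fault["injected"]:
                _fault["injected"] = True
                raise RuntimeError("simulated driver fault")
            return "ok"

    def call_with_recovery(make_driver, op):
        """Run op against a driver; on a fault, restart and retry once."""
        driver = make_driver()
        try:
            return op(driver)
        except RuntimeError:
            driver = make_driver()  # "restart" the crashed driver
            return op(driver)       # retry; the caller never sees the fault
    ```

    The point of the sketch is isolation: because the fault is contained in the driver object rather than the kernel, recovery is an ordinary control-flow decision instead of a blue screen.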
  • by Junks Jerzey (54586) on Friday January 14, 2005 @12:13AM (#11356862)
    Mmm, no. Commodore was the first to really do this. The original Amiga had native graphics capabilities (like multiple resolutions onscreen) that still aren't available in PC hardware.

    In the interest of historical accuracy, the Atari 400 and 800, first publicly available in 1979 (six years before the Amiga), allowed mixing multiple resolutions on screen. You built a display list of modes and the hardware interpreted them. You could mix text, graphics, and various resolutions of each. You could also trigger interrupts to occur on a specific display list command.
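    The display-list idea described above can be sketched as a tiny interpreter: the "hardware" walks a list of (mode, line-count) entries, so different modes coexist on one screen. The mode names here are invented for illustration:

    ```python
    # Toy display-list interpreter in the spirit of the Atari 400/800:
    # each entry names a display mode and how many scanlines it covers,
    # and the "hardware" resolves the mode per scanline.
    def render_scanline_modes(display_list):
        """Expand [(mode, lines), ...] into the mode used on each scanline."""
        scanlines = []
        for mode, lines in display_list:
            scanlines.extend([mode] * lines)
        return scanlines
    ```

    A list like `[("text", 2), ("hires", 3)]` yields two text scanlines followed by three hi-res ones, which is exactly how mixed text and graphics shared one frame.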
  • by Xyde (415798) <`ten.rrrrup' `ta' `todhsals'> on Friday January 14, 2005 @12:18AM (#11356936)
    The iBook 500 was a horrible, horrible little machine.

    With its crippled 66MHz system bus (even the iMac 350 had a 100MHz bus) and its woeful 8MB ATi Rage 128, it's quite a poor performer under OS X. You can overclock them to 600MHz on a 100MHz system bus with no issues and they perform far, far better.
  • Re:How silly (Score:5, Informative)

    by tc (93768) on Friday January 14, 2005 @01:13AM (#11357662)
    Okay, cluehammer time:

    First, the GPU is the processing unit; the framebuffer is the memory where the bits are stored. Both are involved in any kind of rendering operation, 2D or 3D. The GPU operates on the bits in the framebuffer.

    Second, modern graphics devices don't have any dedicated 2D hardware left in them. They all just use their 3D cores to do basic blit operations. Why waste silicon on specialist 2D blitting when you've got a gajillion megapixels of fillrate sitting right there in the 3D core?

    Third, you are obviously unaware of how modern shader technology works. If I want to stream down 2D coordinates then I can do that just fine. In fact, shaders don't really care what all the numbers are, they just know that they are getting a certain number of inputs. If you choose to write a shader program that interprets them as coordinates to be transformed, then that's merely the common convention. Heck, I could just stream down 1D coordinates if I wanted to (actually, this is genuinely useful, if the coordinate is time and the shader is computing, say, a particle system). So there is really no inefficiency in using the 3D core to do 2D operations, because I can just transmit the minimum amount of data necessary by means of a suitably chosen shader.
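    The point above, that a shader is just a program over opaque per-element inputs, can be shown with a toy "shader" runner whose shader takes a single 1D input (time) and outputs a particle's height; nothing in the machinery assumes 3D coordinates. All names here are invented:

    ```python
    # A "shader" is just a function applied per element of an input
    # stream; the stream's numbers mean whatever the shader says they do.
    def run_shader(shader, input_stream):
        return [shader(inputs) for inputs in input_stream]

    # A 1D shader: the only input is time t, and the output is a
    # particle's height under gravity -- the particle-system example
    # the parent comment mentions.
    def particle_height(t, v0=10.0, g=9.8):
        return v0 * t - 0.5 * g * t * t
    ```

    Feeding it a stream of timestamps instead of vertex positions works fine, which is the whole argument: the 3D core imposes no 3D interpretation on its inputs.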
  • by nathanh (1214) on Friday January 14, 2005 @01:22AM (#11357759) Homepage
    Things like Exposé and translucent windows can come in amazingly handy in OS X (I've never found anything quite as useful as transparent terminal windows, which let me have code open in one window and documentation in the window behind it, and look through the code window to read the documentation, especially when working with an API you're not familiar with).

    I just wish this would make it into X, but alas, I suspect it's the sort of thing that might take a while to get properly implemented and supported.

    I've been using translucent windows and compositing on my Linux desktop for months. It's part of Xorg. Yes, it is hardware accelerated. Yes, it is faster. Yes, it looks cool. Yes, it works today.

  • by ldesegur (547464) on Friday January 14, 2005 @03:05AM (#11358660)
    To be really precise, the Apple ][ in 1977 had a mixed mode with 280x160 hi-res graphics on top of four lines of 40-column ASCII text.
  • by Jameth (664111) on Friday January 14, 2005 @03:07AM (#11358667)
    Just in case you actually cared, KDE 4 will be able to use a pixel shader for rendering the menu. And, assuming both KDE 4 and Longhorn are on time, KDE 4 will come out first. And, seeing as KDE has shipped its last several releases within a few weeks of schedule, it seems likely that at least KDE will be on time.

    So, overall, I quite agree with you. Those slackers over at MS have some real explaining to do about why they'll be the last OS to have any real hardware acceleration.
  • Re:Great, but. (Score:2, Informative)

    by thomasweber (757387) on Friday January 14, 2005 @03:28AM (#11358818)
    Or using apt-get to uninstall? There's ALWAYS remnants.

    Well, at least for .debs, this is a bug. Period. A package might leave something behind in my home directory (personal settings, ...), but everything on the system itself must be cleared if I purge the package. (You know the difference between apt-get remove and apt-get --purge remove?)
  • by Anonymous Coward on Friday January 14, 2005 @04:20AM (#11359124)
    Of course, if you had quoted or read the whole paragraph, you would have noticed this:

    "The performance gains and features supported by Core Image ultimately depend on the graphics card. Graphics cards capable of pixel-level programming deliver the best performance. But Core Image automatically scales as appropriate for systems with older graphics cards, for compatibility with any Tiger-compatible Mac."

    Emphasis on the last sentence mine.
  • by Dal Platinum (829197) on Friday January 14, 2005 @05:34AM (#11359499)
    A couple of things that seem to have passed you by:

    Secretaries don't 'just' write memos. Only an ignorant person would think this. Also, they have to read memos from other people/companies, most of whom will be using Word.

    Secretaries at most companies won't be upgrading to Longhorn immediately. There is no need for them to do so. Everything they need can be done with 2000/XP. By the time they get around to getting a Longhorn-capable computer, Shader 2.0 hardware will be relatively inexpensive. You're thinking of current top-end cards, whereas, in reality, those will be entry level by the time most people upgrade to Longhorn. (Not upgrading is also a possibility.)

    If you think moving all the GUI rendering to the GPU will negatively affect the performance of the CPU, you've been reading the wrong books. What this will do is remove that load from the CPU, thus making it more efficient.
  • by Anonymous Coward on Friday January 14, 2005 @06:06AM (#11359620)
    like multiple resolutions onscreen

    I've seen this mentioned numerous times as a "world first" for Commodore, but the Acorn BBC Micro could do it many years before. AcornSoft Elite, anyone? That really annoys the Commodore and Amiga fanboys.

    Even better, the Atari ST could do it as well: just set low res at the start of a vertical blank, and change to medium res later in a horizontal interrupt. Now that really, really, really annoys 'em.

  • by zerojoker (812874) on Friday January 14, 2005 @07:24AM (#11359915)
    http://www.microsoft.com/whdc/device/display/graphics-reqs.mspx

    They describe the graphics features as follows:

    "For Longhorn, graphics requirements for desktop experiences are defined in relation to differentiated experiences:

    Aero Glass experience: Delivers the full-fidelity Longhorn user experience on the desktop, including support for 3D graphics and animation.

    Aero experience: Delivers the minimum hardware acceleration and desktop composition for the Longhorn user experience.

    Classic experience: Equivalent to Windows 2000 capabilities, using software rendering."

    So I think it will be no problem to just switch off those nice graphic effects if you don't have such powerful graphics hardware. I really don't like Microsoft at all, but I have read a lot of those "Longhorn will need something like a Cray" articles, and all of them were written by authors who seemed not really well informed about the technical details...
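    The three tiers quoted above amount to a capability check at startup. A hedged sketch of that selection (the field names and thresholds are invented for illustration, not Microsoft's actual detection logic):

    ```python
    # Hypothetical tier selection matching the quoted Longhorn document:
    # pick the richest experience the hardware can support. The inputs
    # (a DX9-acceleration flag and a shader-model number) are invented
    # stand-ins for whatever the real detection would probe.
    def longhorn_experience(has_dx9_acceleration, shader_model):
        if has_dx9_acceleration and shader_model >= 2.0:
            return "Aero Glass"  # full 3D graphics and animation
        if has_dx9_acceleration:
            return "Aero"        # minimal hardware-accelerated composition
        return "Classic"         # Windows 2000-style software rendering
    ```

    On this model, a PS 2.0 card gets Aero Glass, a weaker accelerated card gets Aero, and integrated graphics with no acceleration quietly falls back to Classic, which is the commenter's point: no card is locked out, only the eye candy is.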
