Windows Longhorn to make Graphics Cards more Important
Renegade334 writes "The Inquirer has a story about MS Longhorn and its need for better-than-entry-level graphics cards. This is due to the WGF (Windows Graphics Foundation), which will merge 2D and 3D graphics operations into one, and 3D menus and interfaces that require at least Shader 2.0-compliant cards. Supposedly it will really affect the performance of the new Microsoft OS." This has been noted before in the system requirements for Longhorn, but it would seem the full impact is slowly being realized.
How silly (Score:5, Interesting)
This is due to the WGF (Windows Graphics Foundation), which will merge 2D and 3D graphics operations into one, and 3D menus and interfaces that require at least Shader 2.0-compliant cards.
That's just plain stupid. Grandpa & Grandma want to check their email and look at pics of the grandkids; why on earth should they need a Radeon MegaXP293823-XtremeSLI+ to do that? I hope there's an option to disable all that cycle-wasting crud, or MS may be shooting itself in the foot: how many offices will spend a few hundred dollars on individual video cards just to upgrade the OS? What about those machines with onboard video (a la Dell)?
Funny (Score:1, Interesting)
Translation: Microsoft inserts knife in back.
People are not going to buy Longhorn if they need to buy a graphics card too. They will look for something else and find Linux!
Cool (Score:2, Interesting)
I know we'll see a bunch of folks protesting bloat and other FUD - but it'll be cool to see what they come up with for a home UI that strains a vid card.
Great, but. (Score:5, Interesting)
But... it doesn't matter how fast computers get; the Windows Explorer shell always seems to become less snappy, even on fresh installs. XP made the Start menu slower than ever because it retrieves nonessential metadata for the shortcuts. Myriad shell extensions, over time, bring the Explorer UI to a crawl.
Sexy is great, but I have to use it every day. It's just not worth making the UI dog even worse.
Re:needless (rant mostly) (Score:3, Interesting)
I'd be surprised if they really went wild with 3D interfaces like the 'Jurassic Park' file browser, or the cube with web pages mapped onto it that was posted here a while ago. I think they are just going to do what Apple has already done and what Keith Packard is working on for the X Window System.
You are probably right. Microsoft will only use it for flashy effects. At least Apple eventually got to arguably useful things like Exposé. But they had to put Quartz Extreme in place first before they could do it.
I'm not inclined to be charitable, but hopefully this is Microsoft laying the groundwork for interesting and useful user interface ideas.
Naaaah, that is too nice.
Microsoft never was good at copying Apple... (Score:5, Interesting)
Mac OS X 10.2 introduced "Quartz Extreme", which uses your graphics card to composite your screen. This meant that dragging windows around now required almost no CPU power at all. In 10.3, they introduced several 3-D effects to enhance the interface - most notably a rotating cube when you switch users.
There are two key points that Microsoft seems to be missing, though:
* Mac OS X looks exactly the same if you don't have a powerful enough graphics card, and screen redrawing is not too slow. Having a graphics card just makes the system more responsive because the CPU is doing less of the work.
* The system degrades gracefully - if you don't have a powerful enough graphics card or run out of video RAM, certain 3D transitions may be skipped. But everything will still function, and everything will look the same.
It's too early to tell, but it is starting to sound like Microsoft may be creating a new interface that requires a super graphics card, leaving those with only cheap integrated video with a completely different interface. To me that sounds like a recipe for tech support hell - novice users won't understand why their screen doesn't look like someone else's.
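The graceful-degradation idea described above can be sketched as a one-time capability check with fallbacks. This is purely illustrative Python; the GpuCaps fields and effect names are hypothetical, not any real Quartz or WGF API:

```python
# Hedged sketch of graceful degradation: query the GPU's capabilities
# once, then pick the richest rendering path the hardware supports.
# All names here (GpuCaps, the effect strings) are made up for
# illustration -- they do not correspond to any real API.

from dataclasses import dataclass

@dataclass
class GpuCaps:
    has_compositor: bool   # can the card composite windows in hardware?
    free_vram_mb: int      # video memory left for effect textures

def pick_transition(caps: GpuCaps) -> str:
    """Choose the fanciest transition the hardware can afford.

    The key point from the post above: every branch still draws the
    same interface -- only the optional effect is skipped."""
    if caps.has_compositor and caps.free_vram_mb >= 64:
        return "3d-cube-rotation"      # full effect
    if caps.has_compositor:
        return "cross-fade"            # cheaper effect, same look
    return "instant-switch"            # no effect, still functional

print(pick_transition(GpuCaps(True, 128)))   # 3d-cube-rotation
print(pick_transition(GpuCaps(False, 0)))    # instant-switch
```

The contrast with what the article suggests for Longhorn is exactly the last line: the fallback changes only the transition, never what the screen looks like.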
Re:How silly (Score:5, Interesting)
Now all we have to do is pray they don't leave some loophole open that lets someone burn out your video card. Can you imagine built-in Windows overclocking?
*shudder*
call me crazy... (Score:2, Interesting)
Grandpa and grandma will be just fine on 2000 or xp, or...and here's the crazy part...even 98. My father in law still uses win3.freaking-1 on a 486, for Christ's sake. Grandpa and grandma will be just fine.
This will boost the market (Score:5, Interesting)
My dad, a programmer for a large corporation, bought a new computer from Dell with XP on it about a year ago; his previous system was a 350 MHz Dell. A programmer myself, my top system is a 1.2 GHz Duron running Win2K. I've had it for a couple of years.
When Longhorn comes out it's time for an upgrade anyway and most people are going to buy prebuilt systems. Those prebuilt systems will have a (barely) sufficient graphics card.
GeForce FX 5500s are well under $100 already. In a couple of years, when Windows needs that kind of card to run, they'll be dirt cheap and onboard.
And it'll be just in time for when people are looking to upgrade their computer hardware anyway.
Complaining that MS is forcing upgrades is as silly as claiming id Software forces hardware upgrades. I still use 2000 and could use 98 if I wanted. I could also play Wolfenstein 3D and stick to a 386. Something needs to drive the market. If there were no need for better hardware, there'd be no better hardware. It's all artificially driven anyway. There's no objective reason why we need fancy-pants graphics in any software. There's no objective reason we need high-quality, drive-space/CPU/memory-eating audio and video.
In short, who cares that MS is making greater graphics demands for its OS? They've done this with every release. Even Linux is making greater and greater demands. If you want only the graphics pizzazz of Windows 3.11, use Windows 3.11. Some of us like an OS that looks "pretty."
If you want a plain-text OS, then use DOS or ditch the GUI on Linux and have fun.
Well... (Score:5, Interesting)
But the real question is: why are pixel shaders needed? Unless you're doing strange reflections, simulating bumps, or playing around with reflectivity in real time, I can't imagine a use for them. I certainly can't see why you'd need anything more than simple textured quads or triangles. Oh, and some sort of alpha support for shadows. All of that sounds like a TNT2-era card, like the one I used to use to play Quake II.
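For what it's worth, the effects a desktop actually needs -- translucent menus, drop shadows -- come down to per-pixel alpha blending, which fixed-function cards of the TNT2 era already handled in hardware. A minimal sketch of the blend equation in plain Python, purely for illustration:

```python
# The standard "source over destination" alpha blend, channel by
# channel: out = alpha*src + (1 - alpha)*dst. This is the whole math
# behind a translucent menu or soft drop shadow -- a fixed-function
# operation, no programmable shader required.

def alpha_blend(src, dst, alpha):
    """Blend RGB tuple src over dst with the given opacity (0.0-1.0)."""
    return tuple(round(alpha * s + (1 - alpha) * d) for s, d in zip(src, dst))

# A 50% translucent red menu over a blue desktop comes out purple:
print(alpha_blend((255, 0, 0), (0, 0, 255), 0.5))  # (128, 0, 128)
```

Where Shader 2.0 hardware earns its keep is per-pixel *programmability* beyond this fixed equation (blurs, lighting, distortion), which is presumably what WGF is reserving room for.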
What this really feels like is Microsoft pushing hardware adoption again. Ever notice how new motherboards don't come with USB drivers for Windows XP? How you have to upgrade to the latest service pack to get USB support? Partly to curb piracy, and partly, I think, to keep a hold on the market by forcing people to use approved hardware.
Re:Funny (Score:4, Interesting)
The reason why most wifi hardware doesn't work in Linux isn't a lack of trying. It's that hardware manufacturers GO OUT OF THEIR WAY to not support Linux.
For example, my friend got a "v4 Linksys" 802.11b card [IIRC it was Linksys...] and found out that only the v3 card works in Linux.
Similarly, the "SoundMAX" cmpci chipset on ASUS boards [at least when first introduced] was purposely made different from the original cmpci chipset [and didn't work, at least with the 2.4 kernels].
So it's not that Linux developers don't develop drivers [or try to]; it's that hardware developers change specs and don't document things.
In the future just say "Linksys doesn't support their customers [*]" instead of saying "Linux doesn't support Linksys".
[*] Any BS about there not being enough Linux users is just stupid. The benefit of taking the time to write competent Linux drivers [or just release the specs] would far outweigh the cost of doing so.
Tom
Re:Welcome to the Present (Score:5, Interesting)
Mmm, no. Commodore was the first to really do this. The original Amiga had native graphics capabilities that still aren't available in PC hardware (like multiple resolutions onscreen). The OS used them, and used them well. When a more advanced Amiga came along with more graphics capabilities, the OS automatically configured and used them as well. Apple was "me too," much later. :)
But that's OK. Apple knows how to market -- that more than makes up for arriving expensive, late, and/or weak with a number of things. Plus, they provide a really nice end-user experience.
Re:How silly (Score:3, Interesting)
You, my friend, have some other problem with your system. Or you're flat out trolling. I use themes on XP on a 667MHz P3 w/ 384 megs of RAM with absolutely no trouble.
Broken window fallacy (Score:3, Interesting)
Wikipedia: Broken Window Fallacy [wikipedia.org]
Re:Welcome to the Present (Score:5, Interesting)
Hey, I *like* drop shadows and semi-transparency on menus and the like; they provide a "rich" environment and also help to prioritize open windows. Perhaps you are a command-line guru; I work with CAD software a lot, and I appreciate the eye candy as a visual indicator. Then again, if it were up to me we'd toss all the CAD software and hardware and go back to board drafting -- less of the "it's easy to revise because it's on the computer, so let's do it a lot" attitude and more forethought required when designing.
"Keeping up to speed" these days has more to do with updating one's computer knowledge quotient and not enough to do with actually doing real-world stuff and improving skills in the disciplines that we use computers to help us with in the first place.
Re:Welcome to the Present (Score:3, Interesting)
Anyone, feel free to correct me.
Amiga was ahead of even this. (Score:1, Interesting)
Basically, the OS treated its graphics processors as a computing resource, rather than just something to do graphics with, as I've been advocating we do on Linux for a while now. A number of projects to use modern GPUs and/or DSPs for (relatively) general processing exist now, but we need to do more, and use them for anything the OS *can* use them for.
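The "GPU as a general computing resource" idea amounts to a dispatch layer: offload work to an accelerator when one is present, fall back to the CPU when it isn't. A minimal Python sketch; the gpu object and its saxpy method are hypothetical stand-ins for a real OpenCL/CUDA binding:

```python
# Sketch of accelerator dispatch with a CPU fallback. SAXPY
# (y = a*x + y, elementwise) is the classic toy workload for this.
# The `gpu` parameter is a hypothetical accelerator handle -- real
# code would wrap something like an OpenCL or CUDA queue.

def saxpy_cpu(a, xs, ys):
    """Plain-CPU fallback: y = a*x + y, elementwise."""
    return [a * x + y for x, y in zip(xs, ys)]

def saxpy(a, xs, ys, gpu=None):
    """Dispatch to the accelerator when available, else fall back.

    The caller never cares which path ran -- the same principle an
    OS would use to farm generic work out to the GPU or a DSP."""
    if gpu is not None:
        return gpu.saxpy(a, xs, ys)  # hypothetical accelerator call
    return saxpy_cpu(a, xs, ys)

print(saxpy(2.0, [1.0, 2.0], [3.0, 4.0]))  # [5.0, 8.0]
```

The interesting design question for an OS is the scheduling above this layer: deciding which work is worth the transfer cost to video memory, which is exactly what makes "use the GPU for everything" harder than it sounds.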
Re:How silly (Score:3, Interesting)
For $499, the Apple offering gets you:
40 GB HDD (4200 RPM, no less!)
256 MB RAM
1.25 GHz processor
Optical drive isn't a burner
Spending $480 over at Dell (even though I don't like them much either) gets you:
80 GB HDD
512 MB RAM
Celeron 2.6
CDRW drive
17" monitor!
So the processor might be a bit pokier than the G4, but you get twice the storage, twice the memory, a burner, AND a display. And it's still $19 less than the Apple offering. So tell me again how this is competitive?
Just because something is stereotypical doesn't mean it's incorrect. That's how stereotypes evolve.