ATI vs. NVIDIA: ATI Steals the Show (484 comments)

T-Dub writes "Apparently a group of MIT engineers made an OpenGL wrapper for the NVidia demo 'Dawn' (a fairy with high sex appeal). Even though the wrapper adds more overhead, the demo still runs faster on the 9800 Pro and creates higher quality images." Yet another reason it's good to have engineering students on your side.
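
For the curious: the article doesn't detail the wrapper's internals, but interception layers like this are usually built by DLL interposition. A minimal C++ sketch under that assumption (the renamed DLL name and the spoofed vendor string are illustrative, not taken from the actual wrapper):

    #include <windows.h>

    // Token value and typedefs copied from gl.h, so we can export our own
    // glGetString without clashing with the header's declaration.
    typedef unsigned int  GLenum;
    typedef unsigned char GLubyte;
    #define GL_VENDOR 0x1F00

    static HMODULE realGL;
    typedef const GLubyte* (APIENTRY* PFNGLGETSTRING)(GLenum);
    static PFNGLGETSTRING real_glGetString;

    static void load_real_gl()
    {
        if (!realGL) {
            // The real system DLL, renamed out of the way (name illustrative).
            realGL = LoadLibraryA("opengl32.real.dll");
            real_glGetString =
                (PFNGLGETSTRING)GetProcAddress(realGL, "glGetString");
        }
    }

    // Exported in place of the real glGetString (a .def file would export the
    // undecorated name): forward the call, but lie about the vendor so any
    // "am I on NVIDIA hardware?" check passes on an ATI card.
    extern "C" __declspec(dllexport)
    const GLubyte* APIENTRY glGetString(GLenum name)
    {
        load_real_gl();
        if (name == GL_VENDOR)
            return (const GLubyte*)"NVIDIA Corporation";
        return real_glGetString(name);
    }

The real work is translating NVidia-specific extension entry points into their ATI or ARB equivalents call by call; the glGetString shim above is just the hook that makes the demo agree to run.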
This discussion has been archived. No new comments can be posted.

  • OUCH (Score:5, Interesting)

    by KDan ( 90353 ) on Friday May 23, 2003 @07:04AM (#6022652) Homepage
    That's gotta hurt... I feel sorry for nVidia... it does look like they're going the way of 3dfx... Maybe I should buy an ATI card next. nVidia do have good linux driver support, though - does ATI have that too?

    Daniel
    • Re:OUCH (Score:5, Insightful)

      by mharris007 ( 142886 ) on Friday May 23, 2003 @07:08AM (#6022676) Homepage
      That might be true if graphics were the only computer market they were in. Considering they've got one of the hottest chipsets (nForce2) for AMD CPUs right now, I don't see them going completely downhill in all their computer markets.
    • by fluor2 ( 242824 ) on Friday May 23, 2003 @07:17AM (#6022721)
      nvidia practically destroyed 3dfx. They bought 3dfx and refused to update any drivers (e.g. nvidia refused to update the windows xp 3dfx driver even when microsoft contacted them). Thus many people had to buy a new geforce card.

      i fear we all (those of us who had to buy geforce cards to get windows xp working, or who heard about this story) have developed a real HATE for nvidia.

      go go ati. best wishes!
      • by zeno_2 ( 518291 ) on Friday May 23, 2003 @09:49AM (#6023733)
        nvidia practically destroyed 3dfx. They bought 3dfx and refused to update any drivers (e.g. nvidia refused to update the windows xp 3dfx driver even when microsoft contacted them). Thus many people had to buy a new geforce card.

        I'm pretty sure nvidia bought intellectual property from 3dfx, not a contract that said they had to update any drivers. They bought technology, not a promise to support cards that they didn't make. People bought GeForce cards because they were technically better, not because of a big conspiracy.

        i fear we all (those of us who had to buy geforce cards to get windows xp working, or who heard about this story) have developed a real HATE for nvidia.

        I have a Voodoo 5 card, and it's really only usable under Win98. I don't hate nvidia for this; actually, I don't *hate* anyone for it (hating someone over that is pretty silly, I think). Times change, technology changes, and I welcome all of it. Windows XP isn't that good anyway; I wouldn't try too hard to get it to work =).

      • by RatBastard ( 949 ) on Friday May 23, 2003 @11:40AM (#6024812) Homepage
        3DFX was dead before NVidia bought them. It wasn't NVidia that killed 3DFX. 3DFX did that all by themselves. Between unreasonable product delays and abysmal financial mismanagement, they doomed themselves.
    • Financials (Score:5, Insightful)

      by gwappo ( 612511 ) on Friday May 23, 2003 @07:28AM (#6022760)
      nVidia, unlike 3DFX at the time, has a huge pile of cash sitting on their balance sheet.

      Although its engineers need to learn to ignore their marketing dept., the management of nVidia is pretty good.

      Expect them to regain the crown at the NV40 mark; ATI has indicated they'd be slowing their innovation cycles, whereas nVidia has made no such statement.

      • Re:Financials (Score:5, Interesting)

        by thrillseeker ( 518224 ) on Friday May 23, 2003 @08:37AM (#6023193)
        nVidia, unlike 3DFX at the time, has a huge pile of cash sitting on their balance sheet.

        A huge pile of cash is only indicative of the past - during the heyday of the internet bubble there were dozens of companies with huge piles of cash - a year ago United Airlines had billions of dollars in the bank too - they've barely managed to emerge from bankruptcy, and there's still much work ahead.

        What matters is that a company knows what to do with the assets they have. It's awfully easy for a company to get lazy and make money in spite of themselves when everyone else is doing so too - the question is can they continue to do so when times are tough.

        ATI has indicated they'd be slowing their innovation cycles, whereas nVidia has made no such statement.

        ... and one of them is more likely than the other to have correctly gauged the future buying needs of their customers. If the future has people less willing to pay for high-end video cards, which is a good possibility if it tracks the steadily declining sales of PCs overall, then spending megabucks to build high-end hardware is not going to do much good for the bottom line.

        • Re:Financials (Score:5, Interesting)

          by gwappo ( 612511 ) on Friday May 23, 2003 @11:02AM (#6024421)
          A huge pile of cash is only indicative of the past - during the heyday of the internet bubble there were dozens of companies with huge piles of cash - a year ago United Airlines had billions of dollars in the bank too - they've barely managed to emerge from bankruptcy, and there's still much work ahead.

          For nVidia, its pile of cash is definitely only indicative of the past. However, in the internet bubble days, a lot of companies had cash not because of earnings but because of overhyped equity. Most of us know how to spend a lot of money quickly; nVidia has proven, in a most competitor-bloodspilling manner, that it can make it.

          In addition, take a look at all those engineers from Silicon Graphics - when Silicon Graphics made its ill-advised move towards "normal" servers and away from its graphics niche, all those engineers abandoned ship and signed up with nVidia.

          Compare that with 3DFX: shortly before they went bankrupt they purchased the tiny GigaPixel firm for (iirc) 1 billion USD in hopes of grabbing the Microsoft XBox contract, which was at that time assigned to GigaPixel. Before then, they wasted money buying STB (I don't even remember for how much).

          I cannot comment too much on Airlines since I don't track them too much (fully commoditized, too heavily regulated, no serious money to be had in any of its niche markets).

          ... and one of them is more likely than the other to have correctly gauged the future buying needs of their customers.

          A friend of mine showed me a "handy-cam demo" of Half-Life 2, and its engine. In that demo we see the most sophisticated shaders (refraction, translucency, reflection) around, physics engine, you name it -- aside from wondering about how that much content (a _lot_ of detail in that game) could still be profitable for a gameshop, the other conclusion was that I'll probably be needing a new graphics card soon to get a decent frame rate. That kind of engine puts a heavy demand on both GPU (shaders, graphics) and CPU (physics), and I personally strongly believe that this type of engine will drive future demand for graphics cards.

          Imho, graphics is still in its infancy; after the z-buffer paradigm, we still have raytracing to explore. The possibilities are truly endless.

      • NV40? Try R400 (Score:3, Interesting)

        by aliens ( 90441 )
        I do not think the NV40 will be out any time soon, as they are marketing the NV35 as the card built for Doom III.

        ATI, on the other hand, has the R400 coming this holiday season. I've been wowed by the R300/350-based cores and can't wait until laptops start getting Mobility 9600s. ATI could drop the ball, of course, just like Nvidia did with the NV30; that would make for an interesting race. ATI would have to hope their R450 could beat out the unreleased NV40.

        Either way, we can only benefit.
    • Re:OUCH (Score:5, Informative)

      by Sentry21 ( 8183 ) on Friday May 23, 2003 @07:47AM (#6022850) Journal
      ATI has both closed-source drivers that really expose the abilities of the hardware, and open-source drivers which are quite reliable and not problematic at all (in my experience). Radeon support for DRI comes standard in the kernel too, as does framebuffer text console support.

      So yeah, they do.

      --Dan
    • Re:OUCH (Score:4, Informative)

      by samhalliday ( 653858 ) on Friday May 23, 2003 @09:51AM (#6023741) Homepage Journal
      nVidia do have good linux driver support, though - does ATI have that too?

      hmm... i would argue the opposite; i would say NVidia has terrible support for anything other than windows, simply because the drivers are binary, which causes no end of hassle if my kernel/XFree86/opengl apps are in any way different from the versions they considered or compiled against. if you run gnu/linux or freebsd it's often a pain trying to get the graphics card to work after an upgrade (ok, a kernel upgrade is not as bad as an xfree86 upgrade, since they do give a few headers in the tarball, but i heard they will soon stop that). NVidia refuses to release specs for the cards, so no one can even come along and write GPL'ed/OSS'ed drivers.

      ATI, on the other hand, releases binary drivers (i heard they are good, but i don't know anyone who uses them...), releases specs, and has people contributing to the development of XFree86's own drivers and the corresponding kernel drivers for linux and freebsd. if i find a bug, i can write the authors with the details; they usually fix it in CVS and the next day i have a better driver. when i had an NVidia card i found a bug, emailed nvidia about it, and got some auto-response thing which pretty much showed i was being ignored... the next update didn't even have a fix. now, it's not like they don't claim to support gnu/linux, and it's not like i hadn't paid for my hardware... so i felt shafted and bought a radeon.

    • by Daniel Phillips ( 238627 ) on Friday May 23, 2003 @11:48AM (#6024903)
      Like for example:
      NVidia is pleased to announce we published the detailed NV30 interface specs today, available
      here [fly.com]. Come and get 'em!
  • Shame (Score:5, Insightful)

    by wiggys ( 621350 ) on Friday May 23, 2003 @07:05AM (#6022655)
    What's the betting that from now on all nVidia demos will have some code to check that they are running on nVidia hardware?
  • by jkrise ( 535370 ) on Friday May 23, 2003 @07:06AM (#6022663) Journal
    That would add a LOT more sex appeal.
    • by Anonymous Coward on Friday May 23, 2003 @07:37AM (#6022805)
      Dawn nude patch here [3dnews.ru]

      or alternatively:

      rename fairy.exe to 3dmark03.exe for "View" mode without wings
      rename fairy.exe to quake3.exe for "View" mode with wings


      Posted as AC to avoid being viewed as a perv. /* Shout outs to my linux kernel developers in the mean streets of Brooklyn. */
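
      If the demo really does branch on its own file name, the check is probably something like this sketch (hypothetical; the flag names and the use of GetModuleFileNameA are assumptions, not pulled apart from fairy.exe):

          #include <windows.h>
          #include <stdio.h>
          #include <string.h>

          // True if this process's executable file name matches `name`,
          // compared case-insensitively.
          static bool exe_name_is(const char* name)
          {
              char path[MAX_PATH];
              GetModuleFileNameA(NULL, path, MAX_PATH); // full path of this .exe
              const char* base = strrchr(path, '\\');   // strip the directory part
              base = base ? base + 1 : path;
              return _stricmp(base, name) == 0;
          }

          int main()
          {
              // Hypothetical flags mirroring what posters report after renaming.
              bool nude     = exe_name_is("quake3.exe") || exe_name_is("3dmark03.exe");
              bool wingless = exe_name_is("3dmark03.exe");
              printf("nude=%d wingless=%d\n", nude, wingless);
              return 0;
          }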
  • Fairies? (Score:5, Interesting)

    by IamTheRealMike ( 537420 ) on Friday May 23, 2003 @07:11AM (#6022687)
    What is it with graphics card demos and fairies? I know they're written almost exclusively by young guys, but still - the Matrox G400 had a rather curvaceous fairy for their tech demo as well. OK, so obviously not quite as realistic as this one, but couldn't they be more imaginative?

    I mean if you're going to have medieval fantasy characters in your demos, might as well go the whole way and have a proper Princess.

  • by keller999 ( 589112 ) on Friday May 23, 2003 @07:12AM (#6022696)
    You know, when a page gets slashdotted, isn't it just a coincidence that the only thing that will load is the banner ad!
  • trend? (Score:5, Insightful)

    by graveyhead ( 210996 ) <fletch AT fletchtronics DOT net> on Friday May 23, 2003 @07:13AM (#6022703)
    1) Get to the top of the video card market.
    2) Get lazy.
    3) Competitor gets to the top of video card market.

    Rinse, lather, repeat as necessary.

    Didn't NVIDIA learn from 3DFX? Hell, they bought them. I'm hoping this is a driver issue and that subsequent optimized releases of Detonator will speed it up. If not, it is a sad day for NVidia.
    • Re:trend? (Score:2, Informative)

      by Anonymous Coward
      I find it quite funny that most posters don't have any idea about the graphics market.

      Here is the latest market-share data from Q1-2003 [globetechnology.com]:
      Whole market:
      NVIDIA: 31%
      Intel: 29%
      ATI: 19%

      Desktop standalone:
      NVIDIA: 64%
      ATI: 28%

      "I'm hoping this is a driver issue and that subsequent optimized releases of Detinator will speed it up. If not, it is a sad day for NVidia."

      What driver issue? The article says that with an OpenGL wrapper the Dawn demo can be run on some ATI cards, nothing more. The Slashdot summary is quite
    • by gwappo ( 612511 ) on Friday May 23, 2003 @07:32AM (#6022788)
      nVidia did not get lazy; instead, they did the smart thing and diversified, which is why we now have nVidia in the chipset market.

      Good move at the time, and a good move now as it will allow them to bridge the poor comparative performance of their graphics unit vs. ATI.

    • That's competition for ya. But don't expect nVidia to stay down for long; nothing's stopping them from developing faster cards, and eventually they'll be back on top.
    • Maybe this way of doing business was patented by 3Dfx. That's why NVidia bought them.
  • Dawn (Score:3, Funny)

    by Whatthehellever ( 93572 ) * on Friday May 23, 2003 @07:14AM (#6022712) Homepage
    ...does she have a sister?
  • idea (Score:5, Insightful)

    by wiggys ( 621350 ) on Friday May 23, 2003 @07:15AM (#6022714)
    It's pretty silly paying through the nose for an expensive new graphics card when the only thing you can "play" are the tech demos.

    Sure, the card is faster so you can now play your existing games with anti-aliasing on all the time (well, most of the time...) but unfortunately your games won't look any different.

    Remember the first time you installed your 3dfx card (inc. pass-through cable) and played GLQuake? Amazing! High-res smooth graphics on your P166, the envy of consoles everywhere. Then nVidia brought out their TNT cards which did 32-bit colour... nice. But since then, what's changed? Answer: not much. There are only a handful of games which use 50% of the features offered by a GeForce 3. I have a Ti4600 and a Ti200 and it's nigh on impossible to tell them apart.

    Why the Sam Hill should I buy ANOTHER new card when there's simply no compelling reason to upgrade?

    • Re:idea (Score:4, Insightful)

      by Chicane-UK ( 455253 ) <chicane-uk@ntlwor l d . c om> on Friday May 23, 2003 @07:26AM (#6022749) Homepage
      Well... it's not an alien concept that some people just like to have the latest hardware, to be able to run the latest games as fast and at as high a resolution as possible.

      I have a GeForce 4 Ti4400.. don't get me wrong, it is a quick card but there is a lot better out there.

      Yeah, it sucks that I can't afford to have the latest Radeon 9800 Pro or whatever, but I can understand why people would want them.

      I wouldn't mind a 9800 ready for Doom III and Half Life II though ;)
    • I know why! (Score:2, Informative)

      by Anonymous Coward
      Why the Sam Hill should I buy ANOTHER new card when there's simply no compelling reason to upgrade?


      Ah ha!

      Herein lies the challenge of Sales and Marketing departments all across the land.

      Unfortunately, they will find a way to convince you that you MUST get the new Happy Graphics 10,700 GF5x Twin-Turbo Platinum card.

      I think it is the extra LEDs or the fancy new second generation heat spreader that is there to cool the PCB.. because umm, that's uncooled so far!

      But really, if you want to make a conspea
    • I recently upgraded my GeForce 2 MX400 to a Radeon 8500. Not so that I could play games (although playing Serious Sam at 1600x1200 smoothly is nice) but so it could drive my dual monitor setup at 1600x1200@85 x2. I.e. it had a faster RAMDAC. My experience of the two is that the ATi drivers are more stable, but that the nVidia ones are more polished. As for the 3d capability? Well, the newer one is faster, but I really can't bring myself to care.

    • I know I'm not the only person out there building a new computer in a few months for the launch of doom3.
    • Re:idea (Score:5, Informative)

      by Chris Burke ( 6130 ) on Friday May 23, 2003 @10:54AM (#6024329) Homepage
      Yeah, I remember the first time I installed a 3dfx card and played GL Quake. "Holy shit" I think were my first words, followed soon after by "Holy fucking shit". I'm fairly confident that nothing will ever recreate that experience. Going from 320x200, 256 color with speed-optimized (accuracy de-optimized) software rendering at 15-20 fps to 640x480, 16-bit color with all the goodness of OpenGL (transparency, reflections, good dynamic lighting) at 30+ fps was a revelation. It's probably going to be incremental improvements from here on out, and no mere increment will ever have the sheer impact that the first 3 seconds of glQuake had.

      That said, I'm still pretty impressed with the graphical improvement since then. Compare glQuake to... well, anything very recent that uses pixel and vertex shaders. Quake3 became a lot prettier (and faster) when I switched from a Voodoo3 to a GeForce 3. Newer games look even better (rain and water effects with pixel shaders look great). Add more triangle-processing power for more detail and the ability to have dynamic terrain (like waves in water)... I'd say we're progressing nicely.

      No, there's no reason to buy a GF4 if you have a GF3. Maybe a GF5 or whatever ATI has at the time. Skip a couple generations so that your new card really -does- have a noticeable advantage over the old, and you'll be much happier.
  • by Organic_Info ( 208739 ) on Friday May 23, 2003 @07:16AM (#6022717)
    Summary of next 500 posts.

    nVidia is dying...
    No they're not..
    ATi linux drivers suck..
    Ati Windows drivers suck..
    No they don't....
    Benchmarks mean nothing...
    Blah Blah Blah Blah Blah

    Stop wasting your time... everyone buys their graphics card based on what features are important to them. Whether it is raw performance, quality, driver stability, support, supported OSes, cost, availability, etc., it is most likely a compromise mix of all of the above.
  • Graphics Wars (Score:5, Interesting)

    by Anonymous Coward on Friday May 23, 2003 @07:18AM (#6022727)
    I guess I too follow these stories closely, as I'm a fairly avid gamer. However, all this tugging over which graphics chipset company is better is likely to go on for some time. People say that NVidia is now the clear-cut loser. Well, I wouldn't be so fast to judge that. I believe that, like many other industries, this one has two or more powerful companies that have good years and bad years when compared against their competitor.

    It seems that NVidia is having a bad year or two right now. But they're a big and resource-rich company, hopefully know what they are doing, and were just caught off guard by ATi pushing the technology in last year's 9700 to market so soon. I think they are operating in "catch-up" mode, desperately trying to capture back the coveted crown of the graphics wars. And it is that preoccupation that is keeping them from true technological innovation.

    Once NVidia resumes its role as a technology innovator, much like ATi is now, the race will once again be on for the true champion of the graphics wars.

    And when that happens, I think that is a definite signal that graphics will again become something like the arms race of the cold war, each side battling to be "best". But better, because when you have two free-market companies battling it out, it usually comes down to as much blistering performance as is available for the dollar.

    And that is excellent for all gamers, and the general computing public at large.

    But maybe that's mumbo jumbo, it sounds good 5 hours past my bedtime. nighty night.
    • Re:Graphics Wars (Score:4, Insightful)

      by Falrick ( 528 ) on Friday May 23, 2003 @07:46AM (#6022841) Homepage
      One feather that nVidia has added to their hat in the last two years is the beginnings of diversification. No longer are they completely reliant on the consumer graphics market, what with their entry into the motherboard market. They have produced graphics chips for the XBox and have made a healthy entry into the mobile graphics chip realm. A couple (few?) years back they received a contract from the US Gov to produce graphics chips for displays in military jets (if memory serves).

      They have much more going for them than being purely a gaming chip company. Given time I suspect that we'll see nVidia and ATi oscillate in the leadership position. nVidia, while in the valley, has other businesses to fall back on.

      While not always the case, companies with a backup plan tend to be more willing to take risks, simply because if the risk doesn't pan out it doesn't spell disaster for the company. I think that we'll see more innovation coming from nVidia yet.
  • The last 3 cards I have purchased have been by NVidia: two GeForce 3s (32 MB) and a GeForce 4 Ti4200 (64 MB). I recently built a new rig, and after months of comparing the latest and "greatest" Nvidia cards to the 6-month-old 9700 Pro, there was NO contest. The 9700 Pro "pwnz" everything the GFX can throw at it. I'm a little dismayed by this, seeing as NVidia's driver support is outstanding compared to ATI's.

    My advice if you don't want to spend too much on a vid card: buy a 9500 Pro, it's light years beyond the 9600 Pro, and
  • Does anybody have any mirror/BitTorrent URLs for that wrapper? Good ol' Slashdot effect at 7.22 AM...
  • by Blaster Jaack ( 536777 ) on Friday May 23, 2003 @07:22AM (#6022735)
    ATI vs. NVIDIA: ATI Steals the Show
    I dunno if you guys heard about this one, but nVIDIA actually had an E3 party that went pretty wild [ezboard.com] (some topless pics). Not only did they have Smash Mouth perform at their E3 party, but they also had some porn star make an appearance too (she was eventually kicked out for dancing topless on a table). Check google, it's all fact.

  • I'd like to run the demo just to see it :)
  • Guts (Score:2, Interesting)

    by georgep77 ( 97111 )
    I wonder when a graphics card company will have the GUTS to release a demo that can run on ALL hardware and not just their own. Of course the company in question would need to have hardware greatly advanced from whatever else was available. Perhaps ATI could have done this when the R9700 was launched???

    Cheers,
    _GP_
  • by Anonymous Coward on Friday May 23, 2003 @07:29AM (#6022768)
    Yet another reason it's good to have engineering students on your side


    Yes, yet another reason. The other reasons I can think of are:

    1. you need a group of people to make you seem a lot cooler than you actually are
    2. you need to stop the flashing "12:00" on your VCR
    3. you need other trekkies who actually want to wear the orange shirt
    4. avoid thermonuclear war


    Any other reasons to have them on our side?
  • Maybe they never should have purchased 3dfx :P

    Seriously though, I do have faith in nVidia; I'm sure they'll be able to engineer their way out of this. Eventually.
  • Okay, I've got a Linux box here at work with a GeforceFX in it - I'm going to have a go at running it under WINE, but I won't be holding my breath... I'm assuming Dawn is an OpenGL demo?
  • Mirror (Score:5, Informative)

    by woodhouse ( 625329 ) on Friday May 23, 2003 @07:56AM (#6022886) Homepage
    If anyone wants it, I've mirrored the OpenGL wrapper here [pipex.com] (78k).

    You can get the dawn demo from here [nvidia.com]
  • by Anonymous Coward
    You'll have to pay 20% more for it: Pricewatch currently shows GeForce FX 5800 running for $326 [pricewatch.com] and up, while you won't find the Radeon 9800 for less than $394 [pricewatch.com]. So either way you're basically just getting what you pay for.
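    (Checking the math on that claim: 394 / 326 is about 1.21, so the Radeon's premium is roughly 20%, as stated.)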
  • by execom ( 598566 ) on Friday May 23, 2003 @08:18AM (#6023048) Homepage Journal
    Again, this demonstrates the 'schism' between ATI and NVIDIA, and arguably an advantage of Direct3D over OpenGL. OpenGL code is not portable across vendors unless you maintain a render path for every target (which takes time). Such things wouldn't happen if the Dawn demo were Direct3D.

    This also shows that ATI could write drivers that handle Nvidia's proprietary OpenGL extensions, which is not what they are actually doing. They do support a couple of extensions; GL_NV_occlusion_query, for example, works on my Radeon 8500. But I'd like to see GL_NV_point_sprite too (right now you can't do point sprites, i.e. particles, on ATI under OpenGL, only in Direct3D).

    These students should contact ATI and contribute the source of their modifications for the next ATI driver. That would be really nice, and legal; these extension specs are published in the OpenGL extension registry anyway. Sadly, for 'policy' reasons, it won't be accepted by ATI (I've already tried, in fact).
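
    The portable way around the extension mess described above is to probe the extension string at runtime and keep a fallback render path. A rough C++ sketch (the extension names are real; the path structure is illustrative, and glGetString requires a current GL context):

        #include <windows.h>   // gl.h on Windows needs this included first
        #include <GL/gl.h>
        #include <string.h>

        // Probe the extension string of the current GL context. strstr is a
        // loose match; production code should compare whole space-separated
        // tokens to avoid prefix collisions.
        static bool has_extension(const char* ext)
        {
            const char* all = (const char*)glGetString(GL_EXTENSIONS);
            return all && strstr(all, ext) != NULL;
        }

        void choose_particle_path()
        {
            if (has_extension("GL_NV_point_sprite")) {
                // NVIDIA-only path: hardware point sprites
            } else if (has_extension("GL_ARB_point_sprite")) {
                // vendor-neutral path, where the driver exposes the ARB version
            } else {
                // fallback: camera-facing textured quads, one per particle
            }
        }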
  • by ericvids ( 227598 ) on Friday May 23, 2003 @08:23AM (#6023077)
    John Carmack had a .plan update before which pointed out that NVidia's card is actually cranking out more precision by default compared to ATI. That could explain why ATI runs faster by default. When he used an NVidia-specific rendering path (where the precision is more or less the same as ATI's), the NVidia card actually ran faster.

    Is this somehow related to the discrepancies in this fairy demo?
  • Nude patch (Score:5, Interesting)

    by Otis_INF ( 130595 ) on Friday May 23, 2003 @08:24AM (#6023093) Homepage
    I haven't tested this (I don't own an FX nor an ati card with the balls to run it) but on the VE3D forums someone mentioned this to make Dawn naked:


    Well for you sickies looking for the nude patch, no need. Just change the fairy.exe file name to one of two things. Quake3.exe will make her nude, and 3dmark03.exe will make her nude and wingless.

    On some other forum (here) [nvnews.net] they talk about renaming some file to get rid of her erm... clothes/leaves. :) Enjoy.
  • by fluxrad ( 125130 ) on Friday May 23, 2003 @08:41AM (#6023218)
    From what I understand, the FX5800 is a crap card. A review at tom's hardware [tomshardware.com], however, shows the FX5900 pretty much beats the crap out of anything ATI has right now. I'm sure this will change with the next iteration of hardware, but hey - it always does.

    Either way, we should stop talking smack about nVidia when the best card on the market pretty much depends on when you're looking for it ;-)
  • Pointless (Score:3, Insightful)

    by Junks Jerzey ( 54586 ) on Friday May 23, 2003 @09:12AM (#6023444)
    3D graphics are still very much a niche on the PC. This may change with Microsoft's plans to do something like Apple's "Quartz Extreme" in a future version of Windows, but at the moment there are still only a relative handful of games that even require a hardware transformation pipeline (available since 2000), and there are even fewer that do anything at all with programmable shaders (available since 2002). At the same time, the slide in the PC game market continues. A lot of people, including myself, expected it to turn around a bit by now, but no dice. What the PC does have is a couple of games that will be big with hardcore gamers: Doom 3 and Half-Life 2. In a lot of ways, nVidia and ATI are designing cards specifically for those games, and not for the perceived 3D market in general.

    In short, the race for the high-end video card market is increasingly meaningless, especially with the growing shift away from desktop PCs and the ridiculous power consumption and level of cooling required for high-end cards. If GeForce 2 class chipsets start shipping in an all-in-one, cool running, silent PC, then the real goal has been achieved. Gamers and CG people who want or need to blow $400 on a new video card + 10lb heat sink combo every few months can still do so. That's not a mass market industry any more, though.
  • Big deal (Score:4, Funny)

    by donscarletti ( 569232 ) on Friday May 23, 2003 @09:28AM (#6023561)
    ATI compatibility, big frigging woop!

    If anybody wants to do something actually useful they can get rid of those frigging fig leaves!

  • by sootman ( 158191 ) on Friday May 23, 2003 @09:56AM (#6023789) Homepage Journal
    ...posting links to a site with high-res pics of a hot virtual babe, and the server got slashdotted? I'm a-fucking-stounded. Didn't see that coming.
  • by baxissimo ( 135512 ) on Friday May 23, 2003 @10:19AM (#6023985)
    The article claims the image quality was improved, but the first thing I noticed when I opened up one of the 4xAA screenshots was that the antialiasing on Dawn's hair isn't working properly in the ATI version. The NVIDIA demo uses the GL_SAMPLE_ALPHA_TO_COVERAGE_ARB feature to feather out the hair at the ends. (This makes the card use the alpha value of a fragment to determine the number of covered samples when multisample antialiasing is on.)

    Check out the hair on this NVIDIA shot [nvidia.com]

    and compare with the hair in this shot off the ATI [rage3d.com].

    The zoom on the NV example isn't quite as close-up, but you can still see how the hair feathers away at the tips, while the hair on the ATI is all jaggy and of uniform thickness right out to the tips. Like some kind of Raggedy Ann yarn hair.

    So does the ATI not support alpha-to-coverage? Or did these guys making the wrapper just not translate it properly? The hair looks pretty bad without it.

    I seem to remember when I saw the NV guys give the demo that they made a point of how expensive the hair was to render. Each hair is a separate GL_LINE_STRIP, there are thousands of them, and if you turn off the complicated blending and smoothing on each one of those little hairs, that could maybe add up to a large savings. I don't know, but I wonder if that would account for most of the speed-up they see running it on ATI.

    Still a neat hack. It would really rock to have complete NV<->ATI compatibility dlls that would work for all apps and not just this demo.

    While they may have made some slight improvement to one aspect of image quality by improving normalization, which I guess makes the lighting a little more accurate, I really doubt the improvement is all that noticeable. Maybe I could tell given a side-by-side comparison, but I doubt I would notice if only shown a version normalized one way or the other. On the other hand, that ugly hair is pretty obvious. To me that makes the NVIDIA screenshots look better.
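
    For reference, here is roughly what using that feature looks like in GL. This is a hedged sketch of the feathered-hair idea described above, not code from the demo; the token values come from the ARB_multisample spec, and the colors and vertex data are made up:

        #include <windows.h>
        #include <GL/gl.h>

        // Tokens from ARB_multisample; old gl.h headers don't define them.
        #ifndef GL_MULTISAMPLE_ARB
        #define GL_MULTISAMPLE_ARB              0x809D
        #endif
        #ifndef GL_SAMPLE_ALPHA_TO_COVERAGE_ARB
        #define GL_SAMPLE_ALPHA_TO_COVERAGE_ARB 0x809E
        #endif

        // With multisampling on, alpha-to-coverage turns each fragment's alpha
        // into a sample-coverage mask, so a hair drawn as a line strip can thin
        // out toward a transparent tip instead of ending in a hard, jaggy edge.
        void draw_one_hair()
        {
            glEnable(GL_MULTISAMPLE_ARB);
            glEnable(GL_SAMPLE_ALPHA_TO_COVERAGE_ARB);

            glBegin(GL_LINE_STRIP);
            glColor4f(0.4f, 0.3f, 0.2f, 1.0f); glVertex3f(0.00f, 0.0f, 0.0f); // root: opaque
            glColor4f(0.4f, 0.3f, 0.2f, 0.5f); glVertex3f(0.02f, 0.5f, 0.0f);
            glColor4f(0.4f, 0.3f, 0.2f, 0.0f); glVertex3f(0.05f, 1.0f, 0.0f); // tip: fades out
            glEnd();

            glDisable(GL_SAMPLE_ALPHA_TO_COVERAGE_ARB);
        }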

  • by Brat Food ( 9397 ) on Friday May 23, 2003 @11:17AM (#6024596) Homepage
    I was excited to try this out last night, and it runs beautifully, but after reading the comment about the hair (how it's not shaded to kind of fade out), I also noticed that Dawn has no eyelashes, though she apparently does on nvidia cards. Having seen the demo on both cards, I can't say technically why the ati does in fact seem to have a nicer overall image (is it the gamma? is it actually what rage3d mentions?), but nonetheless, this is still a really, really cool hack. Now if we get the hairs back, we can have an even matchup ~
