
ATi Radeon X1K Graphics Launched, Benchmarked

MojoDog writes "ATi has officially launched their all-new Radeon X1000 family of 3D graphics cards this morning, and a full showcase with benchmarks of the entire line-up can be found at HotHardware. What may or may not be surprising to you is the fact that the new high-end flagship X1800 is still a 16 pixel pipe GPU, but now running at a blistering 625MHz. Is it fast enough to catch NVIDIA's 24 pipe GeForce 7800 GTX?"
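One rough way to frame that 16-pipes-at-625MHz question is theoretical pixel fill rate (pipes times core clock). A minimal sketch; the 430MHz figure for the 7800 GTX is my assumption, not something stated in the article:

```python
# Back-of-the-envelope peak pixel fill rate: pipelines * core clock (MHz).
# The 7800 GTX clock below is an assumed figure, not from the article,
# and real performance depends on far more than fill rate.

def fill_rate_mpix(pipes: int, clock_mhz: int) -> int:
    """Theoretical peak pixel fill rate in Mpixels/s."""
    return pipes * clock_mhz

x1800_xt = fill_rate_mpix(16, 625)   # Radeon X1800: 16 pipes @ 625MHz
gf7800gtx = fill_rate_mpix(24, 430)  # GeForce 7800 GTX: 24 pipes @ ~430MHz (assumed)

print(x1800_xt, gf7800gtx)  # 10000 10320
```

On this crude measure the two parts land within a few percent of each other, which is why the clock bump matters so much to ATi's pitch.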
This discussion has been archived. No new comments can be posted.

  • What really matters (Score:5, Interesting)

    by Anonymous Coward on Wednesday October 05, 2005 @10:11AM (#13721861)
    How are the Linux drivers?
    • by bigtrouble77 ( 715075 ) on Wednesday October 05, 2005 @10:37AM (#13722092)
      The Linux drivers have come a long way. I've been using the latest 32-bit drivers with good success. On my mobile Radeon 9700 I average 2500fps in glxgears in Ubuntu. Maya seems to be working pretty well too, although I haven't tried any really complex scenes yet.

      In the last release ATI included a graphical installer which sorta worked, but I still had to compile the fglrx modules, which would be a pain for a complete noob. It's too bad that the ATI control panel is really only useful for configuring dual monitors and confirming that OpenGL is working. It would have been nice to have the plethora of OpenGL features the Windows control panel has. You still have to edit xorg.conf for a few things.
    • by theJML ( 911853 )
      I think ATI still believes that Linux is not a long-term viable gaming platform and therefore they are not coming out with Linux drivers... Which is why I still run Nvidia cards on my Linux boxes. They have the same unified driver base for Linux as they do for Windows, and it just works. And since Linux is what I use 95% of my time, I'm not buying ATI anytime soon.
      • I think ATI still believes that Linux is not a long-term viable gaming platform and therefore they are not coming out with linux drivers...

        Well, if it was about gaming only, you might be right, but Linux is most likely a long-term viable WORKSTATION platform. Graphics/CAD/ERP professionals need good stable drivers. ATi would ignore Linux at their peril.
    • by slummy ( 887268 )
      From experience, the proprietary Linux drivers that ATI provides aren't that great. They're still very buggy. I've had good success with the open source ATI drivers [sourceforge.net].
    • They suck, of course. Noticeably slower 2D than their OSS counterpart (try running a 2D-intensive emulator, for example), doesn't support Xinerama like the X.org driver does, and you should consider yourself lucky if you manage to get good framerates from the OpenGL screensavers in XScreensaver.
    • What matters even more to me than the speed of the drivers is the compatibility. I waited, and waited, for GLSL support in Linux from ATI. I finally get it and... the GLSL support is crummy. And I don't mean slow, I mean buggy. Some correct programs don't compile (at least then I know what the problem is). Others simply didn't run correctly (in Windows or on an nVidia card, no problem). Why bother to enable features in your driver that don't work?
  • by j-joshers ( 880017 ) on Wednesday October 05, 2005 @10:12AM (#13721866)
    I thought ATI was going to seize the advantage from Nvidia with these cards but from what the article is telling me it appears to be a GF5900-style bust. I was thinking the X1600 would've been exactly what I needed but I may just get the 6800GT instead. Oh well.
    • i guess we'll have to hope for a 32 pipe nvidia crushing r580 once the r500 for the 360 is all done and they can get back to work on nvidia ownage.
    • by Ford Prefect ( 8777 ) on Wednesday October 05, 2005 @10:44AM (#13722148) Homepage
      It does look like hot hardware to me - I think this is what the 'blistering' refers to.

      Just make sure your PC has adequate cooling and is kept away from flammable items!
    • I don't think it will be *that* bad. Other sites like DriverHeaven [driverheaven.net] are giving a marginal nod to the X1800XT over the 7800GTX, even with the new Nvidia drivers. One thing Nvidia has been fortunate about is that by getting to market 6 mos. earlier, they have 6 mos. of driver tweaking to boost their performance. I trust ATI driver development more than Nvidia's, and suspect a similar 5-10% boost over the lifetime of the card will occur.

      At the lower end though you're right, the 1600 and 1300 models aren't very s
      • It's a little faster in some cases, and a little slower in others. It seems the old OpenGL/DirectX performance tradeoff with the ATI/Nvidia cards lives on strong. For all the extra time they had, I'm pretty surprised to see them release a part that is at best comparable to a 7800GTX performance-wise, and more expensive too. As for the headroom comment, you could just as easily say the Nvidia cards have loads of headroom in terms of clock speeds, as they are pretty conservatively clocked right now.

        If they h
        • I agree with all of your comments. Ultimately I think this generation is the one to skip; wait until the r580 chipset due out early next year... that will be 32 pipes. This is kind of a proof-of-concept generation IMHO.

          As for the price difference, I suspect ATI will milk the fanboi market for a month or two, then rapidly lower the price to compete w/ the 7800s.

          I just wouldn't count this as a failed generation. This x1000 series maps fairly well against 6800s and 7800s, whereas the 59xx series came la
      • Go get an original USB 1.1 TV Wonder. Out of the SIX machines in my house that work right now, only *ONE* (all computers are running the same OS, XP Pro SP2) will work with the damned device. I've had three ATi cards for graphics, and let me tell you, the driver support SUCKS. You might as well use the OMEGA DRIVERS. I couldn't even play Soldier of Fortune 2 at more than 25 FPS on a 1.7GHz P4 with 256 megs of PC2100 DDR using an ATi 9800 card with a brand-new XP install and using ATi'
  • Fast enough ? (Score:4, Insightful)

    by karvind ( 833059 ) <karvind@gm[ ].com ['ail' in gap]> on Wednesday October 05, 2005 @10:12AM (#13721867) Journal
    I will worry more about the drivers, especially for linux. Also ATI had some problems with supply of the chips in the last few quarters.
  • X1000?? (Score:5, Interesting)

    by Serengeti ( 48438 ) on Wednesday October 05, 2005 @10:13AM (#13721871)
    Why do naming schemes suck anymore??

    X1000? I thought the "X" in "X800" was there because those video cards were the generation after "9800" and "9700"... What's next, OSX11?

    Intel, AMD, ATI, nVidia... Mazda... they're all driving me nuts with their product naming schemes, lately...
    • Re:X1000?? (Score:5, Funny)

      by Iriel ( 810009 ) on Wednesday October 05, 2005 @10:20AM (#13721926) Homepage
      But at least they all have a long way to go to catch up to Motorola ^_^

      I can't wait for some XTRM GRFX CRD.....(suddenly disturbed by the likelihood of such blasphemy)
    • Well, they could have continued w/ the Roman numeral theme and had XM instead of X1000, but I'm pretty sure that is trademarked.

      That is probably why naming schemes suck, the marketing department has to come up with a name no other company has ever used in order to get it through the legal department.
    • Well, the key problem is that in marketing speak, x is cool, but xi is weird and hard to pronounce. ex eye? zi? No matter how you pronounce it, it just sounds wimpy compared to x. So once you reach x, what can you do? You pretty much have to shift to a different naming/numbering scheme.
      • Well, the key problem is that in marketing speak, x is cool, but xi is weird and hard to pronounce. ex eye? zi? No matter how you pronounce it, it just sounds wimpy compared to x. So once you reach x, what can you do? You pretty much have to shift to a different naming/numbering scheme.

        not true, look at Xbox. what do they call the sequel to make it sound cooler? the Xbox 360, like a skating term. best of all, they can make the sequel to the 360 the 720, the sequel to the 720 the 1080, and then the 1440 i
        • Ahh, but note that they didn't call it the xibox, and that they did not already have a part number involved. I suppose ATI could have called their new line the X800360, but how confusing would that look? And what's better, a x850360 or an x800720? It all gets very confusing. ;-)

    • Re:X1000?? (Score:5, Funny)

      by Quasar1999 ( 520073 ) on Wednesday October 05, 2005 @10:33AM (#13722062) Journal
      They don't have to make sense...

      Geforce 2
      Geforce 3
      Geforce 4Ti
      Geforce 5600
      Windows 3.1
      Windows 95
      Windows 98
      Windows 2000
      Windows ME
      Windows XP
      Windows Vista

      I'm simply trying to point out that nothing makes sense in the computer industry anymore... Hell, my Dell Dimension 8100 PC is crappier than my Dell Dimension 4600. Explain that to me.
      • Re:X1000?? (Score:3, Funny)

        by Mawbid ( 3993 )
        Each Dell is crappier than the next?
      • Re:X1000?? (Score:4, Funny)

        by thebdj ( 768618 ) on Wednesday October 05, 2005 @12:11PM (#13722835) Journal
        Okay, the GeForce scheme makes sense. They just stopped calling them GeForce 1, 2, 3... The first number is obviously your generation number. The second number is the one you want to look at to guess-timate relative performance. This new one is a bit odd by ATI, but it makes sense: X1***. The *** is what to worry about. This is generation X1, or they could've said XI to possibly confuse you a bit less.

        As for the GP's comment, what makes you think OS 11 won't have some weird name like OS X1 or the slightly less weird XI? Look at ESPN games: they are sticking to that 2K* numbering system, though I guess that makes sense. You want a pointless and totally meaningless system? Use the Pentium 4 system. I at least liked knowing the clock speed by looking, even if it did mean nothing.
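        The "first digit is the generation, second digit is the tier" reading described above can be sketched as a toy decoder. Purely illustrative; the tier labels are my own guesses, not vendor terminology:

```python
# Toy decoder for GeForce-style four-digit model numbers (e.g. 5600, 6800, 7800).
# First digit ~ generation, second digit ~ relative performance tier.
# Tier names here are illustrative assumptions, not official marketing terms.

def decode_geforce(model: int) -> tuple:
    generation = model // 1000          # e.g. 6800 -> generation 6
    tier_digit = (model // 100) % 10    # e.g. 6800 -> tier digit 8
    if tier_digit >= 8:
        tier = "high-end"
    elif tier_digit >= 5:
        tier = "mid-range"
    else:
        tier = "budget"
    return (generation, tier)

print(decode_geforce(5600))  # (5, 'mid-range')
print(decode_geforce(6800))  # (6, 'high-end')
print(decode_geforce(7300))  # (7, 'budget')
```

        Of course, the SE/LE/XT/GT suffixes break this neat picture, which is half of what this thread is complaining about.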
        • "You want a pointless and totally meaningless system, use the Pentium 4 system. I at least liked knowing by looking what the clock speed was, even if it did mean nothing."

          You mean like the AMD system too?
          • The AMD system actually uses their "estimated" speed system, and is at least a fair comparison to Intel's clock speeds. It was designed to make processors with half the clock speed look competitive, because they were. The Intel system is just stupid and assigns a number that actually says nothing about the processor itself. Though I guess with ignorant consumers you will have what you will.
    • "X1000? I thought the "X" in "X800" was there because those video cards were the generation after "9800" and "9700"... Whats next, OSX11?"

      Actually I think the "X" in "X800" has more to do with it being designed for PCI-X. The same goes for the X300, which AFAIK is a much slower performer than 9800.
      • I think ATI already said that X800 was supposed to be the next generation version of the 9800, because X was 'ten' and they didn't want a five digit number, i.e. 10800. Windows XP is really Windows 2002 (I think), but it is really Windows NT 5.1.

        Marketing numbering is all over the place, it's hard to make sense of it. If you aren't paying attention you can lose track.

        Then there are the comical names, like MSI's naming NEO FSTR. I thought it suggested that it was a "new fister".
    • Keeping track of graphics card naming schemes is giving me a headache. I had an ATI 9700 Pro, and at the time the 9800 Pro was the fastest. There was also the 9500 and 9600, which were mid range cards, and something below.

      Then we hit the X series. Suddenly X800 Pro is the lowest of the high end cards, and there's an X800 XT (or something) which is the higher one. Now we've got XL, GT, GTX, X700s, X300s, and X1n00 somethings.

      I want a name that clearly indicates the generation of card (which '9', then 'X' and
      • This gives room above and below for expansion, and means that I know a card is faster, in that brand/generation, if the number is higher.

        The consumer is not supposed to know. You remember the SE editions of 9xxx series, or the nVidia mx cards?
        You're supposed to buy them without knowing that they're ridiculously underpowered, contrary to what you'd deduce from the model numbers. Then you'll need to upgrade sooner.
        The high-end gamers always know, people like me suffer.
        I know, do your research prior to buying

    • OS X11? No. X isn't as bloated as emacs yet.
  • by neosake ( 655724 ) on Wednesday October 05, 2005 @10:14AM (#13721881) Homepage
    • Today's winner of the Roland Piquopalliehsasdflkj Honorary Blogger Self-Promotion award is: Mojo-Dog. Special kudos go to slashdot editors, who continue to accept slashvertising as real "submissions".

      In honor of this award, we are now providing a special slashdot effect to Mojo-Dog's server. Guess sometimes that self-promotion backfires, doesn't it?

  • At Last (Score:5, Funny)

    by Winckle ( 870180 ) <mark@@@winckle...co...uk> on Wednesday October 05, 2005 @10:15AM (#13721885) Homepage
    Now to reach into my bottomless pit of money!
  • Honestly... (Score:5, Insightful)

    by Shads ( 4567 ) * <shadus@@@shadus...org> on Wednesday October 05, 2005 @10:15AM (#13721890) Homepage Journal
    ... I'm not that impressed. Technically the product looks to be superior, but performance-wise it's not doing well... it seems more like a "dud" generation, like the early FX series leafbl... graphics cards Nvidia put out. I think the next generation of Nvidia and ATI cards are going to be much more interesting than the present generation. Have to wait and see, though.
    • Re:Honestly... (Score:4, Insightful)

      by argStyopa ( 232550 ) on Wednesday October 05, 2005 @11:34AM (#13722525) Journal
      I'm definitely impressed. This is the cutting edge of gaming goodness, all pixelly and shaderly, I think everyone should go out and get them. ATI should aggressively price them to entirely redefine the video card market and seize share from Nvidia.


      Oh, no, I'm not impressed with this card. I'm impressed with the opportunity that ATI will jump totally on this bandwagon, thus reducing dramatically the prices for all their other cards that are from previous generations but offer nearly identical performance.

      W00t for the bleeding edge, and the price breaks behind the curve!

    • obviously you missed the fact that when gaming at 1600x1200 with 4x antialiasing and 8x anisotropic filtering, the x1800xt beats the shit out of the gf7800.

      if you're buying a 500 dollar card, are you seriously worried about benchmarks that are run without aa+af? this card even does HDR (high dynamic range) plus AA, something that the gf7800 can't.

      this card is way more sophisticated and highly refined than the brute-force 7800. the 7800 isn't bad, but that this card can do with 16 pipelines what th
  • Excellent (Score:5, Funny)

    by Anonymous Coward on Wednesday October 05, 2005 @10:16AM (#13721894)
    I will remortgage my house in anticipation
  • No (Score:4, Interesting)

    by Anonymous Coward on Wednesday October 05, 2005 @10:20AM (#13721925)
    The problem is not the hardware, it's the software, and our understanding of how to write decent software. ATI's drivers, while 'fairly good', still suck horribly at some rudimentary tasks. For example, ever seen how crappy an ATI mobile chip makes video from your laptop look on a TV set?
  • Since this was "delayed" technology, I can see the logic of staying with the Radeon name, especially since they have put a lot of money into the recognition for the line. But in the short term, it just makes me say ho-hum to this new release. Now we'll just have to wait for NVIDIA's next shot across their bow.
  • Usable links (Score:1, Insightful)

    by Anonymous Coward
    One-page review without a bazillion flash ads:
    http://www.hothardware.com/printarticle.cfm?articleid=734 [hothardware.com]

    Coral cache:
    http://www.hothardware.com.nyud.net:8090/printarticle.cfm?articleid=734 [nyud.net]


  • by Anonymous Coward
    Since they are not doing an AGP version (which they were initially going to do) of the 1800 I could care less. ATI has forced me back towards Nvidia
    • by theantipop ( 803016 ) on Wednesday October 05, 2005 @10:36AM (#13722090)
      ...who incidentally isn't producing AGP versions of their cards either. The way it stands, ATI has the fastest (and highest number of) AGP cards to offer.
    • They are going to have AGP versions of the X1600 and X1300. Just not the X1800.
      Just as a side note, before you jump back to Nvidia, there won't be an AGP version of the 7800 series either. (At least one is not planned; some industrious company may build their own using Nvidia's bridge chip, just as someone may do that for ATI, it just won't be anytime soon.)

      Right now your AGP choices are probably going to end with the 6800 or x850 being the fastest. Sorry, but us AGP users are pretty much now considered a
  • Leafblowers (Score:4, Interesting)

    by jandrese ( 485 ) * <kensama@vt.edu> on Wednesday October 05, 2005 @10:29AM (#13722010) Homepage Journal
    Well, it had to happen sooner or later. The X1800 engineering sample card pictured in the article is double high and has a giant blower on the top of it. I wonder how long it'll be before we get a card that comes with an external fan attachment that you have to hang off of the back of your case?
    • I think this is just an issue with version 1 releases of new heat-making cards - wait until the third-party manufacturers get ahold of the chips and put more efficient (space and energy) cooling systems on them
    • Yeah, this isn't exactly a first. Some of the higher-end FX cards (well, at least one) had a dual-slot config for their massive cooling. As I recall, ASUS was one of the first OEMs back then to release a single-slot cooled card, and eventually others did too.
    • You think the cards are going to be able to stay inside the case?
    • Already happened, sorta. There's a silent GFX cooler by Thermaltake called the Schooner [thermaltake.com] that uses a heatpipe to extend cooling fins outside of the back of the case via an empty PCI slot. Quite nifty IMHO, although I don't like the idea of a fairly fragile piece of metal next to big hulking VGA/DVI connectors.
    • I wonder how long it'll be before we get a card that comes with an external fan attachment that you have to hang off of the back of your case?

      Don't worry. Soon all die-hard PC gamers will easily be identified in public by their installed ATI/nVidia heat exchanger box sitting right next to the AC compressor unit outside. Or, it will just be hung out the window.
  • My 2 cents (Score:5, Insightful)

    by GFPerez ( 683823 ) on Wednesday October 05, 2005 @10:31AM (#13722030) Homepage
    My 2 cents: there are two key aspects: 1) price; 2) availability. 1 - If the price is too high, it will be very difficult to convince people to buy a high-end card with almost the same performance as a card that costs $100 less (7800GTX). 2 - Remember that the X1800XT will only be available in mid-November, which gives nVidia a lot of time to think about how to counter-attack with something like a 7800 Ultra.
    • yeah, like releasing yet another "optimized" driver...

      and by optimized i mean reducing visual quality in the drivers and cutting corners behind the users back.

      ati is guilty of it too but not nearly to the same degree that nvidia is.

      that's one reason why i will never purchase an nvidia card while ati or another manufacturer still produces cards.

      nvidia is well on its way to being another intel or microsoft in regards to ethics. they're getting there but they still have a ways to go.
  • by dauthur ( 828910 ) <johannesmozart@gmail.com> on Wednesday October 05, 2005 @10:35AM (#13722075)
    "What may or may not be surprising to you is the fact that the new high-end flagship X1800 is still a 16 pixel pipe GPU, but now running at a blistering 625MHz. Is it fast enough to catch NVIDIA's 24 pipe GeForce 7800 GTX?"

    Most people are worried about price, availability and not what counts with ATI cards nowadays: Power. I bought an X850 AGP and the power requirements are absolutely ridiculous. Surely, my Antec 550w can handle it, but it's completely unnecessary, as shown by nVidia. I don't like the idea of having to put aside an extra $10 a month to power my graphics behemoth, although I do love the performance.
    • Here are some power consumption figures:

      http://techreport.com/reviews/2005q4/radeon-x1000/index.x?pg=16 [techreport.com]

      We measured total system power consumption at the wall socket using a watt meter. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. The idle measurements were taken at the Windows desktop, and cards were tested under load running a loop of 3DMark05's "Firefly Forest" test at 1280x1024 resolution.

      7800GT: 112W (idle) / 204W (load)
      X1800XL: 144W (idle) / 207W (load)
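      Since those are wall-socket numbers for the whole system, one rough way to compare the cards is the load-minus-idle delta. A sketch only: the delta still includes PSU efficiency losses and extra CPU work while feeding the card, so it is not the GPU's own draw.

```python
# Load-minus-idle deltas from the quoted system-level wattage figures.
# NOT the GPU's own power draw: the delta also includes PSU losses and
# whatever extra the CPU burns while driving the benchmark.

readings = {            # card: (idle watts, load watts), whole system
    "7800GT": (112, 204),
    "X1800XL": (144, 207),
}

deltas = {card: load - idle for card, (idle, load) in readings.items()}
print(deltas)  # {'7800GT': 92, 'X1800XL': 63}
```

      Note how the X1800XL's higher idle draw flattens its apparent load delta; idle consumption matters if the box is on all day.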

    • actually the power requirements are on par with nvidia's.

      both require obscenely high amounts of power... on the order of 200 watts at full load.

      and the official specs call for a 450 watt power supply on a fully outfitted system (according to ati).

      http://www.ati.com/products/RadeonX1800/Products.html [ati.com]

      if you want lower power requirements, get a lesser card... but then you'll be trading performance for power consumption. it's a choice...
  • And you thought two GPUs were hot? Well, not anymore with this new motherboard hotty (with pics) [tomshardware.com] supporting not 2 or 3, but 4 (OMFG!) GPUs via 2x SLI. Of course, all this technowhoring glory comes at a cost, with 4 GPUs likely to force most average gamers into submissive bondage for a month or ten, not to mention what it will take to prevent such a toasty little box from going critical!

    ==Nuclear Power Now!==
    • Sure you can plug in four cards - but they won't all help with the 3D heavy lifting.

      From the article:

      "What is missing, of course, is SLI support. What would NVIDIA lose if it were to agree to Gigabyte's solution and release drivers that support SLI across four GPUs? Well, I guess there is not much to lose by being "first to market" here, especially since only very few people have the tremendous budget needed to go for a super expensive motherboard - we expect a minimum price of $250 - and as many as two dua
  • The best thing (Score:4, Insightful)

    by Bullfish ( 858648 ) on Wednesday October 05, 2005 @10:45AM (#13722156)
    If you're like most people and running a single 1280 by 1024 monitor resolution tops, the best thing about these cards is they make the top end of the previous generation cheaper. I can only see one of these cards (nvidia or ati) being a must buy if you are running 1600 by 1200 or multiple monitors. Especially as many games are frame locked at certain rates. My 9800XT still plays any game I throw at it just fine regardless of what the hardware sites say. Between the two manufacturers, it's a matter of preference regarding the image quality. Me, I think ati is a little sharper, but that is subjective.
    • Well, I think it is pretty safe to say that the target audience for these cards is high end gamers willing to spend ~$2000/year on game playing hardware. Once you're in that target market, you surely do have either a 1600x1200 or 1920x1200 monitor, in which case you really do want something faster pretty badly. And with LCDs getting their response times lower and lower (enabling higher frame rates), these cards are (as usual) going to become useful for an ever growing percentage of the gaming population.
    • 9800XT too

      Yea, it's done me right.

      I will say Doom3 has noticeable slowdown with more than 2 enemies onscreen, though. The bottleneck has still been my CPU and RAM for every other new game.

      When these new cards came out after the 9800 and 5900 series were released (X800, etc.), I was scared I'd wasted my money looking at the performance improvements.

      But, it takes so long for the average PC gamer to upgrade his card, software developers don't completely take advantage of the hardware.

      I think people put too much e
      • I liken it to athletes racing. Usually it all comes down to tenths or less of a second. Same with these cards. Buy what looks best to you because the performance differences are negligible. Very much like CPUs hitting the wall vis a vis performance. Now their only answer is to go dual core, quad core etc. Until someone makes a super breakthrough, this is the way it will be for quite a while I would think.
  • Where's the AGP?! (Score:3, Insightful)

    by cavemanf16 ( 303184 ) on Wednesday October 05, 2005 @10:47AM (#13722165) Homepage Journal
    Some of us are still humming along on our AGP 4x/8x AMD64 mobos with plenty of RAM to spare. Where are the new graphics cards for us?!?! nVidia and ATI are in some damn war over their latest, greatest PCI Express cards while they pay little attention to providing cards built for AGP slots. This, quite frankly, sucks. I'm not a freak about buying every new graphics card that comes out, but it's getting to the point where it's about time to upgrade (so I can enjoy more features of HL2's DoD:Source HDR tweaks) and you simply can't [newegg.com] buy [newegg.com] an nVidia 7800 card for an AGP slot. If I'm going to spend twice as much on a video card as on any processor I've purchased in the last 5 years, it had better be the best I can get right now so that it lasts me a long time to come, but alas, no such card is made for my mobo! Where's the love, graphics card companies?
    • Buy an X850XT. It's just about as fast as the fastest cards out in most cases, plus it should see a bit of a price drop into the $300 territory in the next few months. If you are so interested in dropping $500 for the latest video card, maybe a $75 PCIE motherboard wouldn't be too much for you to consider.
    • I think they pretty much expect you to upgrade your motherboard ($50-$100) more often than you upgrade your video card ($500). That seems like a pretty safe bet to me. Why not just purchase a more modern motherboard and transplant everything if you really are ready for a big new video card purchase. If your CPU/memory is sufficiently out of date that you can't, your system probably couldn't make good use out of the video card upgrade anyway.
    • Re:Where's the AGP?! (Score:3, Informative)

      by antime ( 739998 )
      Keep in mind that these are only ATI's reference cards. The actual chips are compatible with AGP bridges, so it's almost certain some card manufacturer will make AGP versions of these. It's only a question of when and for how much.
  • Who cares? (Score:2, Interesting)

    Is there some current game that demands that much performance?
    Once you're gaming at 100fps 1600x1200 with all the bells and whistles, why do you need a new card (or pair of cards)?

    Is there some game in the pipes that will actually use all the fancy features that these cards have?

    • I have a 19" Sony X-Black TFT, which is quite responsive, a reasonably large display, and has really vibrant colour reproduction; I prefer it to my 19" CRT as I can use it for long periods without feeling fatigued. This means I never play above 1280x1024 (and at 60 FPS, as I like to play in vsync with the display to prevent tearing, which on a TFT is enough FPS for anyone).

      I am primarily concerned about image quality, which means Anti Aliasing (4x or higher), Anisotropic filtering (x8 or x16) and smoothing on
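      The vsync-at-60FPS point has a sharp edge to it: with vsync and plain double buffering, a frame that misses a refresh interval waits for the next one, so the displayed rate snaps to an integer divisor of the refresh rate. A minimal sketch of that arithmetic (assuming no triple buffering):

```python
import math

# With vsync + double buffering, a frame that isn't ready at the refresh
# deadline waits a whole extra interval, so the effective rate is
# refresh / ceil(render_time / refresh_period).

def vsync_fps(refresh_hz: float, render_ms: float) -> float:
    period_ms = 1000.0 / refresh_hz                  # 16.67ms at 60Hz
    intervals = max(1, math.ceil(render_ms / period_ms))
    return refresh_hz / intervals

print(vsync_fps(60, 10))  # 60.0 -> card keeps up, capped at the refresh rate
print(vsync_fps(60, 20))  # 30.0 -> just missing the 16.7ms budget halves the rate
```

      That cliff from 60 straight to 30 is why headroom still matters even on a display capped at 60Hz.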
      • Yeh. I have a similar problem with bad shadows and reflections, and "infinitely thin" transparent surfaces that don't implement refraction. Mostly, I can ignore them, but sometimes they're really glaring and while a developer can write special-case code to deal with individual problems (raycasting shadows, for example) it's not automatic and they inevitably forget about some of the places where there should be shadows, reflections, caustics, or other optical effects that just fall out automatically in raytr
    • Where did you find a card that can handle 100M polygon scenes with HDR lighting and fully-procedural textures, at 100fps at 1600x1200?
  • Apparently... (Score:2, Redundant)

    by christoofar ( 451967 )
    it's not fast enough to serve up web pages for their performance stats (/.'d)
  • Paper Launch Part 2 (Score:3, Interesting)

    by DeadBugs ( 546475 ) on Wednesday October 05, 2005 @11:32AM (#13722505) Homepage
    It's been noted on many sites that these cards are not available for sale yet. ATI has been getting hammered lately over their decision to "paper launch" crossfire, while telling review sites that they would be in stores. Don't expect to see these cards for at least a month.

    http://www.anandtech.com/video/showdoc.aspx?i=2550 [anandtech.com]
  • Never Again ATI (Score:4, Interesting)

    by Greyfox ( 87712 ) on Wednesday October 05, 2005 @12:06PM (#13722793) Homepage Journal
    It took 8 months from my last PC purchase for them to support PCIE on Linux (if I'd gone for a slightly less "high end" model with nvidia it would have worked from day 1), and their latest quirk is that if you install their latest X.org FireGL driver on Debian by using alien --to-tgz, then untarring the tar file at the root level, it'll change permissions on every directory it writes in to 0700. You may then find that your regular user account can't, say, run ls. Fortunately, fixing that isn't too hard once you figure out what caused it. I'd file a bug with them, but you have to register on their web site and "Debian isn't supported." It'll be a cold day in hell before I put another piece of ATI hardware in one of my systems.
  • No mention of a fanless version for those of us with water cooling.

    Is that too much to ask of card manufacturers?

    I broke my last ATI card trying to pry their glued-on fan from the GPU, and don't want to do the same with my next purchase.

  • Guys, H264 has already shipped and there are example videos at Apple. We Mac users were first to try it, and people even running 2.3GHz dual G5 monsters experienced what you did trying to view VCD-class MPEG on a Pentium 75 :)

    That H264 will dominate, since both the competing Blu-ray and HD DVD camps decided on that format. Add the satellite feeds and digital terrestrial TV too, and you will really want something that does most of the work on the GPU.

    After seeing amazing quality and excellent compression, I am even looking for a ch
  • Who cares about FPS without knowing the actual (theoretical) millions/billions of instructions per second it can handle? That's the main deciding factor in computers/processors. It is not the megahertz/gigahertz that matters. If in one clock cycle a processor can execute 100 instructions, then at 625MHz it'd be doing 62,500 MIPS. No graphics card does that yet.

    Are we all still ignorant to the "More MHz is better" syndrome? Please don't tell me people are still encumbered by this FUD, *ESPECIALLY* on /.
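    The arithmetic in the comment above, spelled out (the 100-instructions-per-cycle figure is the commenter's hypothetical, not a measured number for any real chip):

```python
# Throughput in MIPS = instructions completed per cycle * clock in MHz.
# The 100 instructions/cycle below is the parent comment's hypothetical
# figure, not a spec of any actual GPU.

def mips(instructions_per_cycle: int, clock_mhz: int) -> int:
    return instructions_per_cycle * clock_mhz

print(mips(100, 625))  # 62500
```

    Which is exactly the commenter's point: clock alone tells you nothing until you multiply by per-cycle throughput.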
