Graphics Hardware

Budget Graphics Card Roundup 186

Posted by samzenpus
from the looking-good-for-less dept.
Anonymous Coward writes "Not all of us are prepared to drop $500 on a killer graphics card. Generally, the sweet spot in price and performance is in the budget category of GPUs. Joel Durham Jr. over at ExtremeTech reviews nine current graphics cards, all of which are below $250 and some below $150, to determine which cards are worth the time and money for the gamer on a budget. In the sub-$150 category, the ATI Radeon 4770 performed the best for its price. Spend a little more and Joel recommends the GeForce 260."
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Wednesday May 20, 2009 @08:47PM (#28034081)

    ...first posts?

  • Wrong... (Score:5, Interesting)

    by Schnoogs (1087081) on Wednesday May 20, 2009 @08:50PM (#28034119)

    ...the sweet spot is in the mid-range. The budget cards are only good for playing those 2-3 year old games you've been putting off forever. The mid-range cards allow you to play current games at modest framerates without having to break the bank. For $200 I can get a card that will play Crysis, STALKER: Clear Sky, etc. at a reasonable resolution. Try doing that with a budget card.

    • Re:Wrong... (Score:5, Insightful)

      by jeffmeden (135043) on Wednesday May 20, 2009 @08:59PM (#28034213) Homepage Journal

      No problem, take two Radeon 4770 cards ($100 each) on a crossfire motherboard and they will run circles around cards in the $200 range. Together they will use less power than the $200-$300 cards, too. See this [tomshardware.com] for more info.

      • Re:Wrong... (Score:5, Interesting)

        by Itninja (937614) on Wednesday May 20, 2009 @09:19PM (#28034421) Homepage
        Yeah, that works. But the money you are saving on the card(s) will be more than eaten up by the need for a Crossfire-compatible board (i.e. one with 2+ PCIe x16 slots). Mobos with only a single slot are less than half the price.
        • Re:Wrong... (Score:4, Informative)

          by log0n (18224) on Wednesday May 20, 2009 @11:22PM (#28035443)

          $235 for the SLI plunge here.

          I've got dual 9800 GTs with a Zotac SLI board for $55, each 9800 was $90. Free shipping with the Negg.

        • Re: (Score:3, Interesting)

          by interkin3tic (1469267)

          Yeah, that works. But the money you are saving on the card(s) will be more than eaten up by the need for a Crossfire-compatible board (i.e. one with 2+ PCIe x16 slots). Mobos with only a single slot are less than half the price.

          Also, uh, wouldn't two cheap memory cards for $100 be about the same as one of the "midrange" $200 memory cards in both performance AND cost?

          $100 x2 = $200?

          I don't know a whole lot about hardware, so maybe multiplication doesn't work the same inside a computer...

          • by Bazer (760541)

            Also, uh, wouldn't two cheap memory cards for $100 be about the same as one of the "midrange" $200 memory cards in both performance AND cost?

            Not quite. Two sticks of dual channel memory will increase performance in stream processing and memory intensive tasks. You won't notice the gain in day-to-day tasks but the difference is there. I'm not an expert, but GPU related work seems like it qualifies as both a stream processing and memory intensive task.
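            Just to put rough numbers on the dual-channel claim (these are stock DDR2-800 figures, not anything from the article, so treat it as a back-of-the-envelope sketch):

            ```python
            # Theoretical peak memory bandwidth = channels * bus width (bytes) * effective clock.
            # Dual channel doubles the effective bus width, so peak bandwidth doubles too.
            def peak_bandwidth_gbs(channels, bus_width_bits, effective_mhz):
                return channels * (bus_width_bits / 8) * effective_mhz * 1e6 / 1e9

            single = peak_bandwidth_gbs(1, 64, 800)  # one DDR2-800 channel -> 6.4 GB/s
            dual = peak_bandwidth_gbs(2, 64, 800)    # dual channel -> 12.8 GB/s
            print(single, dual)
            ```

            Of course that's the theoretical peak; as the parent says, day-to-day tasks rarely come close to saturating it.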

        • by Mattsson (105422)

          If you can accept a small performance hit, you can simply Dremel away the back end of an x1 slot and, if necessary, cut away part of the card edge on the second card if there are components in the way. (I had to cut down my secondary card so that it only has an x4 PCIe connector...)
          It's a bit hit-and-miss, though. It doesn't work with all mainboards, nor with all graphics cards, and Crossfire and such won't always work, since the mainboard manufacturer might not have bothered with support sinc

      • Re:Wrong... (Score:5, Interesting)

        by winphreak (915766) on Wednesday May 20, 2009 @09:43PM (#28034667)

        I remember when SLI was new.
        People would buy two mid-range cards, and that had enough kick to run everything for a few years at a decent rate.

    • Re:Wrong... (Score:5, Insightful)

      by feepness (543479) on Wednesday May 20, 2009 @09:04PM (#28034269) Homepage
      The summary mentions cards below $250.

      I think the problem is the definition is changing. $200 used to be in the lower quadrant. Now it is definitely mid-range. The high-end has dropped out as there is no point to be pushing X trillion pixels.
      • Re:Wrong... (Score:4, Insightful)

        by zippthorne (748122) on Wednesday May 20, 2009 @09:25PM (#28034473) Journal

        I haven't paid over $100 for a video card in 12 years. I've always been able to max out the settings in every game I cared to buy that was available by the time I bought the card.

        And in the first half of that period, I really cared about gaming and gaming performance. I'm sure Best Buy would like you to believe that $200 is a low-end device, but you're seriously much better off getting a sub-$50 card now, and another sub-$50 card in a year's time if you really need to.

        • by PopeRatzo (965947) * on Wednesday May 20, 2009 @09:55PM (#28034773) Homepage Journal

          Does anyone know of a video card that doesn't draw much more power than my old Radeon X1650 but is better? I want to upgrade one of my machines, but I don't want to replace the PSU. I'm holding out on a new system until the i7 machines start to come down in price and I see if Windows 7 is worth bothering with.

          I actually like playing last year's games. I bought Far Cry 2 for 15 bucks on Steam (they were having some sale a few months back). I like to wait a while before shelling out for the new games because a surprising number of them tend to suck, and the real reviews don't start showing up until well after the release, when most reviewers are drunk on hype.

          The exception are the Half-Life 2 episodes. I buy those right away, hoping that Freeman is finally gonna bone that Alex chick. Now that would be some FPS I could get behind.

        • > you're seriously much better off getting a sub-$50 card now, and another sub-$50 card in a years time if you really need to.

          Not if you want to drive a large monitor over DVI. The cheap cards don't have dual-link DVI, so I'm stuck driving my 23" widescreen by analog because the DVI connection won't do 2048x1152 (native resolution). I didn't realize this was a factor when I bought the monitor to replace the 20" 4:3 I stupidly broke (which worked fine over single-link DVI at 1600x1200).

          I painted myself in

          • by Mr. DOS (1276020)

            ATI (well, ASUS) manufactures an AGP version of the Radeon HD 3450; however, I don't know how it stacks up to the AGP version of the 6200.

            Actually, for general comparison, I have a PCI-e Gigabyte Radeon HD 3450 in my work machine, and while I haven't run benchmarks, the PCI (vanilla PCI, not PCI-e) eVGA GeForce 6200 in one of my home machines seems to work better... it could just be drivers, though.

                  --- Mr. DOS

            • Re: (Score:3, Insightful)

              by Mal-2 (675116)

              I don't really care if it improves on the GPU speed, I just need dual-link DVI to properly drive a resolution of 2048x1152. Analog is annoying, though surprisingly adequate given the six foot extension cable in the signal path. It's slightly worse than it was without the extension, but it was worth it to exile the computer to another room (and keep all the goodies in here).

              Whether a card's DVI links are single or dual is something that is generally omitted from reviews, much to my consternation.

              Mal-2

              • Re: (Score:3, Informative)

                by Mr. DOS (1276020)

                I poked around on Newegg a bit, and found the SAPPHIRE Radeon HD 3850 [newegg.com]. It's $100 plus shipping, but it has dual-link DVI and it's almost undoubtedly more powerful than your existing card (which is nice, even if you don't need it). The VisionTek Radeon HD 2400PRO [newegg.com] would probably work too, if you'd rather not spend $100, but there seem to be a lot of complaints about driver compatibility.

                Now I want one of the HD 3850's.

                --- Mr. DOS

          • Re: (Score:3, Informative)

            by Jeremy Erwin (2054)

            Radeon 3850 [tigerdirect.com]

            Radeon 2600 XT [tigerdirect.com]

            Another Radeon 3850 [newegg.com]

            • by Mal-2 (675116)

              Thank you, that was exceptionally helpful, especially as your links clearly state dual-link DVI.

              Mal-2

              • Re: (Score:3, Informative)

                by Mycroft_VIII (572950)
                Be careful with the 3000 and 4000 series; they were designed for PCIe only (not sure, but I think the 2000 series is the same) and they only do AGP with a bridge chip of some sort. There have been issues, and ATI doesn't support those configurations (you're pretty much stuck with the vendor's custom version of the Radeon drivers).

                Mycroft
                • Upgraded a friend's PC to an AGP 3850 and have seen absolutely no problems with it. It's perfectly happy playing games and Blu-ray films; for the money it's actually a pretty decent card (it was the Sapphire one from the first link). It's one of the only choices available to get DX10 on an AGP system (apart from the 3870, obviously).
              • There's also the ATI 3650. The 3850 would be better though....
          • by drinkypoo (153816)

            Unless you're seriously pushing the limits of your RAMDAC, or you just have a shitty one, analog is not any serious limitation. Certainly, any decent VGA cable has more than enough bandwidth to carry 1080p. Seriously, you can do a fine job of carrying a 1080p signal over component cables and that's just three pieces of coax... which of course is what's attached to any high-end video cable's R, G, and B pins (and maybe syncs, too.)

            • by Mal-2 (675116)

              This setup is performing adequately with a 6 foot extension, so I'm not saying analog is intolerable. I just see the pulsing of waves in what should be pure black, and just enough horizontal smearing to annoy me. It's good enough that I'd rather run analog at native resolution than DVI at non-native resolution. Still, $99 for a recent Radeon, plus the cost of a new extension cable (or just one long one, no daisy-chain), is not at all unreasonable and leaves open the option of a second 23" Samsung 2343BWX. A

        • by antdude (79039)

          Max out settings? Which games? Crysis? Far Cry 2? Which screen resolution? 800x600?

        • by master_p (608214)

          If I need to play the new games now, instead of 5 years later, then I don't think paying $100 more now is very important.

    • Re: (Score:2, Interesting)

      by Itninja (937614)
      You know, just because it's not an FPS doesn't mean it's a "2-3 year old game". Most of the games I play came out less than a year ago. In fact, I have a beta of StarCraft III (I know a guy) running on my system now that looks great with my GeForce 9500 GS 512MB card. I think it was like $150, if that.
    • Re:Wrong... (Score:5, Interesting)

      by iamhassi (659463) on Wednesday May 20, 2009 @09:40PM (#28034645) Journal
      "For $200 I can get a card that will play Crysis, STALKER Clear Sky, etc at a reasonable resolution. Try doing that with a budget card."

      RTFA [extremetech.com] Crysis, high settings, 1680x1050... 32.7 fps from the $100 Radeon 4770 [newegg.com]. Anyone want to argue that 1680x1050 isn't a "reasonable resolution"? And remember this was a benchmark, so no doubt there were 100 guys on the screen moving and shouting and explosions and all that stuff that never really happens when you're playing normally, crouching behind a tree trying not to be sniped.

      If that's not enough, spend another $100 and run 1900x1200 at 43fps [tomshardware.com]

      And we haven't even touched the 20% fps gains from overclocking [tomshardware.com]: "At 1680x1050, with 4xAA, you're looking at a greater-than 20% boost - nothing short of incredible."

      Yes, I bought one and it's amazing for $100. Wonder what I'll be buying in 2-3 yrs? A $70 card?
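      Just for fun, here's the price/performance math from those numbers (figures as quoted above; the ~20% overclock gain is applied naively, so this is only a rough sketch):

      ```python
      # fps-per-dollar using the benchmark figures quoted in this comment
      cards = {
          "Radeon 4770, $100 (Crysis high, 1680x1050)": (32.7, 100),
          "$200 setup (1900x1200)": (43.0, 200),
      }
      for name, (fps, price) in cards.items():
          print(f"{name}: {fps / price:.3f} fps per dollar")

      # naive estimate of the ~20% overclocking boost on the 4770
      oc_fps = 32.7 * 1.20
      print(f"overclocked 4770 estimate: {oc_fps:.1f} fps")
      ```

      By that crude measure the $100 card delivers roughly half again the fps-per-dollar of the $200 setup, which is the whole point of the budget category.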
      • by Rockoon (1252108)
        It is most definitely the drive for higher resolutions that has pushed the video card market over the past two years. If you have a regular old budget 5:4 or 4:3 LCD display (1.5 megapixels or less), the 512MB 8800GT is still most definitely good enough for any game that has come out since its release (this includes Crysis, L4D, Dead Space, Far Cry 2, Mirror's Edge, etc.)... that's with all options on the highest allowed setting.

        (normally I turn texture resolution down in order to minimize load time bet
    • Crysis and Stalker clear sky will both run quite well on an hd3870, a card that cost just over $200 when it was released years ago. The 4770 will play them well also.
    • As a consummate geek, it is not like I, like, talk to anyone else, so they won't razz me for being "out of it". Also, I am old enough to tell where to stick it if they do. Anyhow, "old" games that I haven't played before, are "new", to me.

      So, if I buy 3 year old games out of the bargain bin, I am good-to-go.

      The latest, bleeding edge, tech needs time to debug anyhow.
    • by drinkypoo (153816)

      Shit, I wish I was only putting off games for 2-3 years. On the plus side, I'm still getting mileage out of games ten years old, and more! I occasionally dust off dosemu for Populous 2. I only wish Populous 3 had been somehow related.

    • by Aceticon (140883)

      If all you've got is an LCD monitor with a native resolution of 1280x1024 (typical for 17-inch ones), then even the cheapest of the graphics boards tested in the article can run Crysis with all settings at max at the best resolution your monitor supports.

      Until recently I was running World in Conflict @ 1280x1024 with a 7800 GTS (a high-mid-range card from about 3 years ago) with no problems and it looked great.

      My experience of 12 years of gaming on the PC with 3D graphics cards is that, while in the past games

  • by carp3_noct3m (1185697) <slashdot&warriors-shade,net> on Wednesday May 20, 2009 @08:55PM (#28034173)
    As a long-time PC gamer, I have come to the conclusion that there are only two reasons to upgrade your video card: 1) A new fancy game you must play at high settings to enjoy needs more juice from your rig. 2) You find a good performance-to-price-ratio card that fits your gaming needs. I tend to upgrade about once every year or year and a half. I am currently still running the BFG OC 8800GT (for $200 in Dec '07). I play everything from the good old standby Counter-Strike: Source, Left 4 Dead, Call of Duty 4, Far Cry 2, UT3 and many more to the non-graphically-intense, without so much as a hiccup. (I am always looking out for a new game that is worth my money, and though I tend to stick to FPSes I still like RPGs and MMORPGs and even the occasional RTS.) Graphics != good gameplay.
    • by Anonymous Coward on Wednesday May 20, 2009 @09:13PM (#28034361)

      3) Your current card fails.

      My budget card from 2005 recently started producing artifacts during light use and failing in bigger ways during heavy use. It had served me well. I was unable to play some modern games (e.g. BioShock) but there are so many interesting older games that I still haven't had time to play. It seems like what I gain from the price of a video card diminishes as the selection of games grows.

      • by cskrat (921721)

        What card couldn't play BioShock for you?

        I enjoyed it with my 7900GT on an Athlon64 X2 4600 (939 Socket)

      • by atamido (1020905)

        Usually when this happens to me, it means it's time to clean the dust off the video card's heat sink. A bit of dust clogging things up lets the card overheat, causing artifacts.

    • Re: (Score:3, Informative)

      by BikeHelmet (1437881)

      I have to agree with you. I always wait for the good deals.

      I have an 8800GS, which I picked up for $45 in October '08. Just recently I spied a 9800GT for $60. I was tempted to buy it, but decided not to, since all my current games still play fine.

      Left4Dead sure is fun when you get a good team together in Versus. :D

  • by Itninja (937614) on Wednesday May 20, 2009 @08:56PM (#28034177) Homepage
    Here's the single page link: http://www.extremetech.com/print_article2/0,1217,a%253D240530,00.asp [extremetech.com]
  • by gun26 (151620) on Wednesday May 20, 2009 @08:58PM (#28034197)

    As the other components in a PC got steadily cheaper, video cards seem to have stayed stubbornly pricey until recently. But that's changing very fast. I'm astounded by the price/performance breakthroughs we've seen over the last year or so. AMD/ATI deserves full marks for taking the lead on this stuff lately, especially in using a 40 nm process for their GPUs and passing the resulting savings on to the customer.

    Too bad that as a Linux user, I can't really consider running ATI video since their binary drivers seem to be of considerably lower quality than the ones turned out by their arch-rivals at Nvidia.

    By the way, another great article on these new cheaper video cards is at Tom's Hardware: http://www.tomshardware.com/reviews/radeon-geforce-graphics,2296.html

    • by Hatta (162192) on Wednesday May 20, 2009 @09:23PM (#28034451) Journal

      Not cheap enough. Seriously, $150 is a budget card? Hell you can buy an Xbox 360 for that. If I were to buy a $150 video card it would be the single most expensive component of my computer.

      If you're on a budget, and you care about value, you'll get a lot more bang for your buck by simply turning down the quality settings. After all, it's about the game play right?

      • Re: (Score:2, Insightful)

        by gun26 (151620)

        No, $150 is more midrange than budget, at least in my book. In the Tom's Hardware article I cited, they mention an ATI Radeon HD 4670 for $65 and an Nvidia GeForce 9600 GT for $80. Those are today's budget cards. I've had a 9600GT myself for a little over a year now and it gives me all the performance I need. I paid considerably more than $80 for mine a year ago - the price drop wouldn't have happened without the stiff competition from the HD 4670 and other ATI cards. The point is that we're getting a lot m

      • by bh_doc (930270)

        After all, it's about the game play right?

        FFFHAAAHAHAHAHA!! /Publishing executive.

      • Re: (Score:3, Insightful)

        by TheLink (130905)
        Most of us here would already have a PC. That's a sunk cost.

        So the options are:
        a) Buy a game console for game console games
        b) add a video card for 0.5 to 1x the price to be able to play PC games.

        As you can see, it boils down to whether you prefer console games or PC games.
      • by PingSpike (947548)

        I agree that $150 is not "budget", and I own a card in that price range (or, it was in that price range). However, there are now cards in the $50-70 range that are actually pretty great. In the past, anything under $100 was more or less a complete waste of time. $50 was the bottom rung of graphics cards and usually got you some kind of over-neutered piece of crap that was barely better than onboard video except for more frequent driver releases. As mentioned by another here, ATI and Nvidia both offer pretty good

    • by CodeBuster (516420) on Thursday May 21, 2009 @12:35AM (#28035861)
      Or as Richard Stallman says, "Don't buy from ATI, enemy of your freedom [fsf.org]"...
  • by StarHeart (27290) * on Wednesday May 20, 2009 @10:22PM (#28034985)

    I just bought an eVGA GTX 260 216(core) SC at Fry's for $200 + $20 tax. But it had a mail-in rebate for $50, which will bring the price down to $150 + $20 tax. I bought it not as a gaming card, but as a second CUDA card. I already had a PNY GTX 260 (192 core).

    CUDA doesn't play nice with regular graphics usage. Your machine will be really jerky every few seconds. I also didn't have room in my main computer, motherboard or power supply wise. So I put it in my second desktop that I use for iSCSI and a third monitor via synergy. The machine already had a 6600GT, which then became the secondary card. I run X off it. Which leaves the eVGA card just for CUDA. Then I can run it all day and not even notice a performance hit.

  • by dinsdale3 (579466) on Wednesday May 20, 2009 @10:47PM (#28035185)
    Tom's Hardware does a "best graphics card for the money" roundup every month, with a breakdown for various pricing tiers. It also has a hierarchy chart that groups cards by performance level, which helps to compare different models beyond the "best" for each category.

    Here's the one for May. http://www.tomshardware.com/reviews/radeon-geforce-graphics,2296.html [tomshardware.com]
  • by Anonymous Coward on Wednesday May 20, 2009 @10:51PM (#28035227)

    Geforce 260... that can't be much better than Geforce 256, can it?

    (Hint: Maybe it's about time Nvidia thought up a new product name.)

  • Budget? (Score:5, Insightful)

    by sc0ob5 (836562) on Wednesday May 20, 2009 @11:28PM (#28035479)
    Maybe I have a different opinion on what budget is... less than $100 USD. Here I was thinking that I'd read a hardware review on Slashdot that might actually be useful to me. Alas, no.
    • by voss (52565)

      If you don't play Crysis and you're not running Vista, an Nvidia 9800 GT ($100) will be good enough to run pretty much any game at high res. I run mine at 1920x1080.

    • by Spatial (1235392)
      Radeon HD4770, Geforce 9800GT or 9600GT.

      They should be available at or below 100 dollars and will play pretty much any game nicely.
  • by sa1lnr (669048) on Thursday May 21, 2009 @03:37AM (#28036681)

    if they tested "budget" cards on a "budget" system.

    I'm sure lots of people that buy i7/X58 with 6GB of DDR3 put budget cards in their top end system. ;)

    • by Spatial (1235392)
      That would be stupid. The computer isn't being tested; they're only interested in the GPU. The point of the high-end components is to eliminate any bottleneck except the GPU, so that the performance results are directly comparable. Otherwise the comparisons would be unfair or misleading.
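      The bottleneck point can be sketched with a toy model (made-up frame times, not benchmark data): the measured frame rate is capped by whichever stage, CPU or GPU, takes longer per frame, so a slow CPU hides the differences between GPUs.

      ```python
      def measured_fps(cpu_ms, gpu_ms):
          # each frame costs at least the slower of the two stages
          return 1000.0 / max(cpu_ms, gpu_ms)

      # fast test-bench CPU: the GPU difference shows up in the results
      print(measured_fps(cpu_ms=5, gpu_ms=10))  # 100 fps
      print(measured_fps(cpu_ms=5, gpu_ms=20))  # 50 fps

      # budget CPU: both GPUs measure the same, hiding the gap
      print(measured_fps(cpu_ms=25, gpu_ms=10))  # 40 fps
      print(measured_fps(cpu_ms=25, gpu_ms=20))  # 40 fps
      ```

      With the slow CPU, a card twice as fast benchmarks identically, which is exactly why reviewers pair budget cards with top-end systems.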
  • by RAMMS+EIN (578166) on Thursday May 21, 2009 @04:48AM (#28036945) Homepage Journal

    While on the subject, I would like to ask a question. Obviously, I could do the research myself, but someone probably knows the answer off the top of their head. So here is the question:

      - Is there any current graphics card that sells for under 100 USD, and has open source drivers that allow decent gaming? Preferably passively cooled.

    I have a GeForce 6600 (passively cooled) now, which I am happy with in terms of performance. But that's using the closed source driver. With Intel, VIA, and AMD having open source accelerated 3D, is there a video card I can buy now that has the same or better performance, but using open source drivers?

  • Maybe I'm missing something but I bought a Sapphire Radeon HD 3650 512MB card [ebuyer.com] for £57 (about $89) and I consider that budget.

    So far it's been able to handle most of the stuff I've thrown at it, albeit not at the highest possible resolution - but then I've only got a 17" monitor.

  • Why bother spending $500 for a killer graphics card when you can get a killer hard drive for free by installing ReiserFS?
