Benchmarking Your GPU with F.E.A.R.

ThinSkin writes "Monolith's new shooter F.E.A.R. is all fun and games, but it can also be used as a benchmark to test your GPU's performance. ExtremeTech's Jason Cross goes into detail on benchmarking your GPU with this graphically intensive game. The article also tests the performance of high- and mid-range cards from ATI and Nvidia to see which scores top marks." It's a tough game; I had to buy a new rig.
This discussion has been archived. No new comments can be posted.

  • mid range system (Score:5, Interesting)

    by rwven ( 663186 ) on Thursday November 10, 2005 @02:57PM (#14000118)
    I found that if you have a midrange system you can turn everything to max, but keep volumetric lighting and all shadows turned off, and the framerate stays pretty constant no matter what other settings are enabled or at what quality. The only other thing that hit my performance significantly was enabling full-quality textures, because I don't currently have enough RAM ("only" a gig) to keep all the textures in memory. The HD swapped a TON with textures all the way up.
    • ...define mid-range system. I've got a P4 2.6, 1 GB PC3200, and a Radeon 9600SE. Does that cut it for the kind of down-tweaking you're talking about?
      • Hmmm... your system is similar to mine.

        P4 2.6 GHz w/ HT
        1 GB PC3200 RAM
        Radeon 9800 Pro 128MB

        I have to run F.E.A.R. at 640x480 with most of the settings on minimum. Even then I get really low frame rates. I have to make sure I turn off as many processes as possible too. I thought I was running a decent system but I'm feeling that upgrade itch pretty bad right now. I've been researching Nvidia SLI solutions at the moment. Apparently, the F.E.A.R. developers had SLI in mind and it should autodetect it and o
        • Odd.
          Here is my system followed by my F.E.A.R. settings:

          Shuttle AN35-N-400 (not a dual-channel board).
          Athlon XP 2000.
          Radeon 9800 Pro 256MB.
          1 GB Patriot 400 MHz DDR RAM (2 DIMMs @ 512MB each).
          F.E.A.R. installed on a RAID 0 stripe on a Highpoint PCI card.
          OS drive and DVD drive each on their own IDE channel from the chipset.
          On-board sound.

          I was able to run F.E.A.R. (with no slowdown in large open areas) at 1024x768 medium graphics settings with 4x AA and 2x anisotropic filtering.
          Shadows were set to minimum, I usually turn them
        • Hmmm, I have the same video card, slower RAM, and a 2.4 GHz Northwood (no HT), and I can run F.E.A.R. at 800x600 with most effects on, but not the highest texture settings. It keeps a consistently playable framerate above 30, and I've got some background processes too (mainly IIS and MSDE). Now that is only on the demo; I wasn't amused enough to go buy it (though that laser that chars people down to skeletons was fairly entertaining, and I liked the design of that heavy soldier halfway through).
        • I know my generic 9600 isn't going to cut it, and SLI isn't really an option on my board, so I've been looking for a good 8x AGP card that won't kill my wallet. Would a 6600 or 6600GT be worth the cash, or should I just wait until I've got the money to put together a half-decent SLI system?
      • Laptop: AMD 64 3700+, 1 GB RAM, 100 GB HD, 100 MB ATI Radeon X600. It stays pretty solid just under 30 fps, which is still playable. It peaks at about 45 or so, and dips into the low 20s every once in a while. I found that by upping everything from medium to high while keeping textures at medium, shadows off, and volumetric lighting off, the framerate only dropped about 0.5 fps. To be clear, I had shadows and volumetric lighting off ALL the time, not just when I cranked up the settings.
        • From your experience with a mobile X600, do you think a Turion notebook with an X700 would be worth getting? I've been seriously tempted by the new MSI MS-1029, but I don't know how well that card would run.
          • The card runs just fine. I wouldn't get anything slower than an X700, though. I would have paid a little more for that if I had the chance to do it all over again.
      • er, sorry, 128 MB ATI Radeon X600.... duh.
  • ARRG (Score:3, Insightful)

    by tomstdenis ( 446163 ) <> on Thursday November 10, 2005 @03:17PM (#14000362) Homepage
    There was a time when a game/program requiring an excessive amount of computing power was a BAD thing.

    I don't look at a game that requires 350W of computing to run as a "good thing". Sorry, I just don't. Any VB.NET hacker can make an inefficient, bloaty game. It takes real talent to do the same with minimal requirements.

    If this "F.E.A.R." game really requires a $500 graphics card to play, then they can keep it. It's just a game; you'll play it and be bored within a week. Meanwhile you're still out the $500 and your computer is taking "yet more power" to run.

    These peeps really ought to develop games for things like a gameboy or PSP first. Then they'd get an idea of what "optimization" means.

    • Get over it. (Score:5, Insightful)

      by Shivetya ( 243324 ) on Thursday November 10, 2005 @03:39PM (#14000604) Homepage Journal
      With your type of thinking we would still be playing Pong, just in multiple colors. There is nothing excessive about the game's requirements; what is excessive is the whining about how other people spend their time and money.

      There are always those who will try to guilt-trip anyone for whatever reason. Almost always it boils down to money: people who harp about how much gasoline costs, about Hummers, about millionaires buying rides to the ISS.

      Enjoyment and relaxation come in many forms and how people spend THEIR money is of no real concern to me as long as it does not endanger me in the process.

      Computer games are advancing the state of entertainment, attempting to bring realism to the screen. Doing so does require oodles of computer power, and we have that luxury these days. Ten years ago, people would never have imagined the power we now dedicate to games, but looking back the picture changes.

      The amount of power expended by high end PCs is nothing to cry about. In fact it trivializes many other real wastes of power and money.
      • Re:Get over it. (Score:3, Interesting)

        by tomstdenis ( 446163 )
        Here's a tip for ya: the more you optimize, the more you can bring out of the platform.

        You can make *even better* games by optimizing your resources. Sure, you could double the texture resolution [4x the RAM], but that only "improves" the game so far. Then you're left with no memory for, say, good AI or physics or whatever.

        Having 1600x1200 with 8xAA and 16xAF is not the be-all of gaming. It isn't "advanced" either. Look at SIGGRAPH if you're into "state of the art".

        "people like me" are why you have th
        • Ah, but he has a plan... when coal runs out he can burn his money to heat his home!
        • So, where do I get a PPC440 board/box that is at least as cheap as an AMD equivalent, with TV-output and firewire?

          I'm not being flippant, I'm just completely unfamiliar with the embedded PPC stuff, so I'm not sure what's available, what it costs, or where I can get it in single unit quantities. If I can get ahold of something like that for less than the cost of a Mac Mini, it'll be my media center PC/home intranet server in a heartbeat. I'm looking for something with enough oomph to play SD video froma ha
          • Best place to get a PPC would be in an embedded device... unfortunately not a lot of people [read: any] make desktop computers based on the PPC.

            It's one of those chicken-and-egg problems. It would likely cost a good $100,000 or so [if you paid yourself nothing] to develop a desktop kit based on the PPC on your own. Who has that money? And the mass public is so blinded by mindless advertising that they're not seeking alternatives.

            "will this PC run my win3.1 programs?" etc...

            I got my PPC from http://www. []
            • You mean I can save money on my electricity bill AND my hardware bill by just buying an incredibly outdated machine???

              Do you cook your food on an open fire in your garden by any chance?

              If you think "it must take a lot of power to be good, and that's all there is to it," you're sadly mistaken. Furthermore, look at things like the PS2. It doesn't even have a graphics processor like a typical PC. The entire kit runs off 70W of power. You can't even run a P4 processor on

                You mean I can save money on my electricity bill AND my hardware bill by just buying an incredibly outdated machine???

                How is it progress to spend more energy than you have to, to accomplish a goal?

                I mean, what next? Are 3mpg cars better for commuting than 40mpg cars? [assuming relatively equal levels of safety].

                You seem to think everyone plays games, does software builds and transcodes movies. Would you be surprised to learn that most desktops are being used for data entry? Email, web development, word
    • If this "F.E.A.R." game really requires a $500 graphics card to play, then they can keep it. It's just a game; you'll play it and be bored within a week. Meanwhile you're still out the $500 and your computer is taking "yet more power" to run.

      I've only played the demo for F.E.A.R., but I was singularly under-impressed by the graphics - or rather, the design and artwork. Yes, it had fully dynamic shadows (but no radiosity), all sorts of refractive shaders and gloss-mapped, normal-mapped and parallax-mapped ever
  • I have a P4 Dual Core 3 GHz, 2 GB RAM, and a PCI-E BFG 7800 GT OC. The video card is one step below the best (7800 GTX) and the rest is really not that bad. But I can't get more than 1600x1200. I tried to run my 24" LCD at 1920x1200 and it just wasn't playable. Whereas Counter-Strike: Source gets 84 FPS with max details, 4x FSAA, and running at 1920x1200. Guess the next step is to go 7800 SLI. :-/ (OUCH!)
    • If you've already spent that much on a system, going SLI isn't going to kill you too much. It's just another $300-$400 card. Now if you don't have a compatible motherboard and power supply, that might ramp up another $250-$350. Either way, you KNOW you wanna do it. You can't resist. Alma awaits.
  • Dude, Zonk, if some guy from a website writes in with a blurb from their latest article, the least you could do is put " [linky]Someguy from[/linky] writes 'blurb' " in the writeup.
  • by Lord Crc ( 151920 ) on Thursday November 10, 2005 @05:37PM (#14001981)
    Just for kicks, I tried overclocking my rather plain 6800GT card. I saw an almost directly linear relationship between the overclock percentage and the framerate increase in F.E.A.R.

    At most I managed to push it from 350 to 410 MHz (no special cooling), which is a 17% increase. The average framerate went from 41 to 48 fps, also a 17% increase...
    • Though they are probably close, it is more likely a situation where the GPU gains slightly more than a linear increase.

      Some of the GPU is wasted on overhead, whether it's the GUI or some underlying operations.
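
The parent's scaling claim checks out arithmetically; here is a quick sanity-check sketch. The four figures come straight from the comment above, nothing else is assumed:

```python
# Sanity-check the claim that framerate scaled linearly with GPU core clock.
base_clock, oc_clock = 350, 410   # MHz, from the comment above
base_fps, oc_fps = 41, 48         # average FPS, from the comment above

clock_gain = (oc_clock - base_clock) / base_clock
fps_gain = (oc_fps - base_fps) / base_fps

print(f"clock gain: {clock_gain:.1%}")   # 17.1%
print(f"fps gain:   {fps_gain:.1%}")     # 17.1%
```

Both gains round to 17%, consistent with the card being GPU-bound at those settings.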
  • by Might E. Mouse ( 907610 ) on Thursday November 10, 2005 @05:52PM (#14002154)
    The in-game 'benchmark' is misleading - it's just a fly-by, with no A.I. load on your CPU at all. Given how much amazing A.I. there is in F.E.A.R., the numbers you get from the in-game fly-by are not at all representative of real gameplay performance. In fact, they are artificially inflated. If you want to see the difference between non-playable fly-by runs and *real* human gameplay experience, I suggest you read bit-tech's review of F.E.A.R. [] They proved this benchmark was bollocks three weeks ago, and used FRAPS to measure someone physically playing the game. The results are way different. Unfortunately, the Anandtech benchmark review failed to spot this, so those figures are all wrong too.
    • Well, they're saying it's a good GPU benchmark. The fact that it doesn't load your CPU during the fly-by is a great thing for those who want to isolate pure video performance as opposed to game performance. I do agree that they should state that the fly-by bears almost no reflection of in-game performance once AI or physics come into play.
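
The fly-by vs. real-gameplay distinction the thread is drawing comes down to what you average over. A minimal illustration of summarizing a FRAPS-style frame-time capture (the frame times below are made up purely for illustration, not measured data):

```python
# Average FPS from a list of per-frame render times in milliseconds, the way
# a FRAPS-style capture of real gameplay would be summarized.
def average_fps(frame_times_ms):
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

flyby = [20.0] * 100                  # a smooth fly-by: steady 50 fps
gameplay = [20.0] * 80 + [50.0] * 20  # AI/physics spikes during real play

print(average_fps(flyby))     # 50.0
print(average_fps(gameplay))  # ~38.5 - the spikes drag the average down
```

The same hardware produces a noticeably lower average once irregular CPU-side work shows up in the frame times, which is why fly-by numbers look inflated.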
  • by row1 ( 930208 ) on Friday November 11, 2005 @12:25AM (#14004954) Homepage
    If you have an ATI card (R4XX models) you can get a performance boost by renaming FEAR.exe to anything.exe.
    Running F.E.A.R. 1.02 on an MSI X800 XL.
    First I ran with FEAR.exe:
    1st run (fps):
    * min 25
    * avg 46
    * max 93
    * 0% below 25
    * 44% between 25 and 40
    * 56% above 40
    2nd run (fps):
    * min 26
    * avg 46
    * max 91
    * 0% below 25
    * 43% between 25 and 40
    * 57% above 40
    Then I quit and ran anything.exe:
    1st run (fps):
    * min 22
    * avg 42
    * max 103
    * 1% below 25
    * 44% between 25 and 40
    * 55% above 40
    2nd run (fps):
    * min 21
    * avg 42
    * max 111
    * 3% below 25
    * 37% between 25 and 40
    * 60% above 40
    Not believing it, I reverted back to FEAR.exe and it went back to the first set of results.
    I don't know what's going on, but the max framerate jumped by up to 20 fps, as did the percent above 40 fps. The min and avg values went down a little.
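
The min/avg/max and percentage buckets above are easy to reproduce from raw per-second FPS samples. A small sketch of how F.E.A.R.'s benchmark report could be computed (the sample list here is invented, not the poster's data):

```python
# Bucket per-second FPS samples the way F.E.A.R.'s built-in benchmark
# reports them: min/avg/max plus % below 25, % between 25 and 40, % above 40.
def summarize(fps_samples):
    n = len(fps_samples)
    return {
        "min": min(fps_samples),
        "avg": sum(fps_samples) / n,
        "max": max(fps_samples),
        "below_25": sum(1 for f in fps_samples if f < 25) / n,
        "25_to_40": sum(1 for f in fps_samples if 25 <= f <= 40) / n,
        "above_40": sum(1 for f in fps_samples if f > 40) / n,
    }

samples = [22, 30, 35, 41, 46, 52, 60, 93]  # made-up illustrative data
print(summarize(samples))
```

Comparing two runs this way (as the poster did between FEAR.exe and anything.exe) shows how the average can drop even while the max and the above-40 share rise.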
  • In other news... (Score:3, Insightful)

    by SanityInAnarchy ( 655584 ) <> on Friday November 11, 2005 @01:28AM (#14005284) Journal
    ... researchers have discovered that "Half-Life 2: Lost Coast" can be used to benchmark computer hardware. In fact, they also discovered that the original "Half-Life 2", as well as "Doom 3", "Farcry", and "Billy Bob's New Shooter" can all be used as benchmarks, because, incredibly, they all use the GPU, for something, sometimes.

    Nothing to see here, PLEASE, IN THE NAME OF ALL THAT IS HOLY, JUST move along.
