Graphics Software

Matrox Parhelia Benchmarks and Review

Crankshaft writes "Matrox Parhelia boards are due to hit retail shelves by the end of the month. Cards have been sent out and the benchmarks are up, showing the card's strengths and weaknesses. You want great looking Anti-Aliasing? The Parhelia is for you. You want killer frame rates? You might have to look elsewhere."

  • "Matrox's new Parhelia GPU has a slogan behind it that is supposed to deliver the mission statement of the product, High Fidelity Graphics..."

    Hope it's not as shitty as that John Cusack movie.

    ~"Oh my god! The dead have risen and are voting republican!"
  • uhh.. (Score:2, Insightful)

    this card is not worth the asking price. It's nice to see Matrox trying to get back into the game, but the technology in the card is, well... so last year.

    I predict that this card may eventually be popular in high-end workstations, with matrox fans (if there are any), and with people who for one reason or another just don't like nVidia.

    It may also take some market share away from ATi, but I doubt it will make a huge dent.

    The stats really just don't impress me. Then again, I'm a heavy Windows gamer, and from reading their white papers on this card they must not be trying for the gamer market.

    I'm just gonna sit back and wait for the GeForce 5, just like I waited for the Voodoo 5500. Hopefully nVidia doesn't go out of business :P.
    • Re:uhh.. (Score:3, Interesting)

      by -douggy ( 316782 )
      To be brutally honest, I think I would much rather have one of these in my PC than any of the new Nvidia GF4/5 cards. I play games at 800x600 or 1024x768, which is the max nice res on a 17-inch monitor. I don't really care about much above 50-75 FPS.

      What I am interested in is a card that can pull frame rates near a GF4 card's speed but with 2D that is superior and clear to look at. Dual or triple head output is also a very nice feature for the workstation market.
      • Re:uhh.. (Score:1, Insightful)

        Like I said, just wait. The 5's should have the whole 'peripheral vision' thing too.

        As for the low res/framerate thingy, that's why i think it will be a great card for workstations.

        I don't think this card is garbage at all, it has some neat features. It just doesn't pump out the high res framerates/picture quality I'd want in my next gaming rig.

        Then again, this is just the first run of this technology. In the current atmosphere of PC technology, in 6 months a newer version of this card may be the cat's meow.

      • Re:uhh.. (Score:2, Informative)

        by blink3478 ( 579230 )

        The triple-headed desktop is probably the most understated feature of this card. Talk to anyone who deals with graphics day-to-day - 3D animators, video editors, graphic designers - and they'll tell you the extra screen real estate is a big boost in productivity.

        Here are a few triple-head screenshots from Matrox's site [matrox.com], and I believe that odd three-panel monitor is from Panoram Tech [panoramtech.com].

      • Well, if you get a decent DVI monitor and an ATI or GeForce with DVI output, then a great deal of the Matrox advantage - their great RAMDACs - becomes moot. The 2D quality is almost entirely differentiated by the RAMDACs. Multi-head is no longer a Matrox-only game. The one good thing to be said about this new chip is that it doesn't suffer as huge a performance hit with AA as its competitors do, but the baseline performance is so poor that the point becomes moot, as raw performance ends up roughly equal.

        Now, I could see your argument working if the Parhelia were priced and marketed as a budget card, but the price and marketing suggest a GeForce killer. A price a little higher than a Radeon 8500's, but not by much, might be more reasonable, since it seems much closer to that sort of level than to a GeForce 4...

        I personally think ATI's cards are a better deal for Windows-only gaming (based on price/performance ratio), but with Linux nVidia becomes the only viable cutting-edge candidate (the 8500 still isn't 3D accelerated with DRI). I wonder how this new Parhelia card will be with Linux drivers...
        • Re:uhh.. (Score:2, Informative)

          by GeckoX ( 259575 )
          Have you tried multi-monitor offerings from ATI and nVidia?

          I own the latest/greatest examples of both, and I'm here to personally tell you that it is still very much a Matrox-only game, which just had the ante upped.

          ATI's and nVidia's feeble attempts at multi-monitor support are a nice try at best, and a total fucking insult at worst.
      • "What I am interested is a card that can pull frame rates near a GF4 card speed but with 2D that is superior and clear to look at. Dual or triple head out put is also a very nice feature for the workstation market."

        Exactly. It is rare even today to see an ATI or NVidia card that does the simple things like displaying the desktop as clearly and accurately as my loyal 4MB Matrox Mystique 220.

      • One thing that a lot of sites aren't looking at is MINIMUM framerate. Sure, a GF4 might pull 200fps in a Quake 3 benchy and that is fine, but what about future games?

        Say in UT2003 the GF4 can go 30% faster in a benchmark, but what happens when the going gets tough? Does one card drop to, say, 50% speed while the Matrox card "might" not drop more than 10%, even though its top speed isn't as fast? This is the kind of information I would like to see in a graphics card review.
    • Well, the review is seriously /.ed, so I don't know what it says, but...:

      Not everybody cares about games. I, for one, haven't played a game on my box for as long as I've had it. But I stare at this screen for 13 hours a day, and all I care about is that the things I look at (no, it's not pr0n :-) ) don't make my eyeballs fall out.

      I've got a Matrox G450, and I'm pretty happy about that. AFAIK, Matrox is supportive of Linux; they have themselves released GPLed software. It's not going to be my last Matrox card.

    • Re:uhh.. (Score:2, Insightful)

      by lingqi ( 577227 )
      Heh... you are right:

      Matrox can never get straight their real "intention", or at least -- that's what it seems like anyhow.

      They have been trying to blend the workstation / gamer market together into one card ever since the G200s, and well, guess what, it has never been successful. Businesses think it's too "game based", and gamers think they are too slow.

      i digress -- they are not all that slow. frankly i do not understand why people go out and get Ti4600s. i mean -- fine, most arguments come in the form of "it will last me longer for the future generation of games." my ass. if you buy a Ti4600 today, i will bet a dollar to a donut that you will buy the next high end "gamer" card as soon as it comes out too - again in the same predicament -- no games support it.
      but, unfortunately, that does seem to be the market right now...

      getting back -- what should matrox do to gain market share? probably separate the two classes -- or, for **'s sake, just do different packaging and put a different skin on the drivers. clocking them a bit differently would help too. and then cram that useless "face duplicate" technology into the business model, charge 100 bux extra for it, and voilà...

      yeah, i would get one if it was a bit cheaper too. but right now everything i run is okay on a radeon 7500... but then again i also buy games "late" too -- about a year after they are out, they get real cheap -- ~10-20 bux or so.

      last bit. Hothardware says that the matrox card is "elegant" because it does not have "canned capacitors"? man what a load of crap that is. the "canned caps" are for power regulation, and it's there because chip-caps do not get above a few micro-farads. having or not having "canned capacitors" should absolutely not be a factor upon which you rate a card. if necessary -- i would personally get TONS of capacitors if it means the darn thing runs more stable.
    • nVidia clearly has a place with gamers, but as we've seen from lousier-than-lousy attempts such as the Quadro2, they're really not for the high-end workstation. I don't play that much anymore, and perhaps they're great gaming cards, but in 2D use - DTP, desktop and stuff - you just end up with a lot of little bugs with nVidia drivers.

      Perhaps the cards are okay, but the end result matters most. And when I have to spend half an hour poking settings so my DVI panel gives a proper picture, only to lose those settings every time (read: often) I have to upgrade the drivers to fix a little but annoying bug, it's not fun. For a lot of 2D users, the good old Matrox Millennium II and G200 cards are often perfect.

      I never expected Matrox to do an all-out card. I just want to know Matrox provides a steady multi-use desktop card with reliable drivers. nVidia can continue providing high-end fast-but-buggy setups for gamers, and ATI seems to find their way somewhere in between. And there just might be other players around too.

      Isn't that the main thing - when the number of choices is greater than one, it can't be all bad.
    • From what I have been able to tell, Matrox has never really gone after "serious" gamers like yourself. They are going after power users who game occasionally. I bought a G400 Max because it had dual monitor support (which, by the way, was beautiful until my older monitor gave up the ghost). I would definitely go out and buy something like this if I could afford buying two more 17" monitors. The fact that it can compete with other high end gaming cards is an added benefit when I do game, but it is not the primary reason for me to buy a card.

      One thing that I am curious about is how adding two more monitors affects all of those benchmarks. If there is only minor degradation then this card is farther ahead in the game, IMHO, than people are willing to admit.
    • this card is not worth the asking price

      It all depends on how you value things. Matrox has consistently put out better Linux support than Nvidia (with their binary drivers) and ATI (which certainly takes its time getting support to cards). They also came to Linux early on, and do a good job of releasing tech specs. There are special Matrox drivers in mplayer that write directly to the card for even more speed -- something that exists for no other card. Xv and 3D support are solid, without crashes or weirdness.

      I've used both a G200 and a G450, and as long as Matrox continues to be the most pro-Linux of the major graphics card vendors, they'll keep my business (by the same token, if they turn away from Linux, I'll find another vendor).
  • by BitchAss ( 146906 ) on Tuesday June 25, 2002 @12:52PM (#3763421) Homepage
    It says about the same thing that the above review does. Here's the link [tomshardware.com]
  • ...by the "omg this is gonna rock" fanboy hype. It looked great on paper, but the GF4 4600 gives it a right good spanking. Well, there are going to be lots of artists trying to get their hands on this one...

    it'll be like the old Matrox G400 - runs decent and looks great. I guess it all comes down to speed vs. pretty. Maybe they'll fix it in the drivers! Of course that's what they all say.
  • This technology is overrated. We should really be exploring the full potential of flip-book technology.
  • The latest issue of MaximumPC [maximumpc.com] has a nice write up about the Parhelia. Looks nice, but I'd still put my money on a Geforce4 Ti :)
  • Short-sighted (Score:2, Interesting)

    by ubergeek ( 91912 )
    I for one am a little troubled by the attitude displayed in all the available "reviews." Their major concern seems to be frame rates in SS and Q3A, two games built on old technology. What I'm concerned about is high-resolution performance with AA enabled. I have no intention of ever again running a game below 1024x768 with AA enabled. Why would you, when the Parhelia can do it without breaking a sweat?

    I wonder what The Carmack has to say about this card. I'd like to see some benchmarks of the Parhelia running DooM3 at 1024x768 w/ 16xAA. Now that the NDA's are lifted, I hope he'll wake these people up to the fact that there is life after Q3A.
    • Yeah, you are short-sighted. The latest Quake III (and probably earlier builds, called point releases) supports the Pentium 4 and its extensions...

      Oh, and until Doom arrives, it will be used, hate it or not.
    • Re:Short-sighted (Score:1, Insightful)

      by Anonymous Coward
      How else would you like these benchmarks done? They don't exactly hand out Doom 3 alpha binaries to just anyone. Besides, if the Parhelia doesn't generate massive framerates on 'old tech' like Quake 3, what do you think it's going to do on Doom 3?

      Who cares what Carmack thinks? He's not about to say "The Parhelia is the only card for Doom 3" -- there's no way he'd shoot himself in the foot like that. Think of how many people get software bundles with their new video cards, and often these include games like Quake and its brethren; but if Carmack makes it known that he's optimized or somehow skewed it so that it's better on the Parhelia, you're not going to find a lot of people getting special versions of Doom 3 with their cards. Unreal Tournament 2003 maybe?

      As for waking up and finding life after Quake 3... what would you suggest these people play? Most everything else is just a clone, or a licensed clone. If you want a fucking beautiful slideshow, there's this game called Myst you might enjoy, and if you want to show off your nice new card, RealMyst.
    • by pm ( 11079 ) on Tuesday June 25, 2002 @01:20PM (#3763613)
      The review at Anandtech (http://www.anandtech.com) uses the latest demo from Unreal Tournament 2003 as one of the core benchmarks. It didn't make much of a difference to the outcome. In that review, the Parhelia scored about as well as a Radeon 8500LE.

      In the review Anand attributed it to three things. Quoting from Anand's review:

      1) Low GPU Clock (220MHz vs. 250 - 300MHz)
      2) Sub optimized drivers
      3) A lack of serious occlusion culling technology

      Whatever the reasons, the Parhelia didn't score well on one of the more anticipated and graphically intensive games that will be released in the near future.
    • From what I've seen the parhelia doesn't run too spectacularly at high resolutions (read: 1600x1200) with AA of any sort.

      I'd rather have my game running at 100FPS@1600x1200 with no AA than 35FPS with 16X AA.
      • The truth is, with your game running at 1600x1200, anti-aliasing isn't going to do much for you anyway. At that level, the pixel jaggies can become unnoticeable.
      • Um, you do realize that you cannot perceive framerates over around 32 fps don't you? In other words, you're wasting horsepower running at 100fps and would be way better off running the 16X AA at 35FPS.
        • Re:Short-sighted (Score:2, Informative)

          by agallagh42 ( 301559 )
          Yes you can. On a non-interlaced screen, framerate is noticeable up to 60ish. I saw a great little demo with little white cubes sliding across the screen. The one going at 30 fps looked very bad, and the one at 60 was nice and smooth. It's when the framerate goes beyond the refresh rate of the monitor that it gets silly...
        • Try looking up the Nyquist frequency for the problem with this thinking--basically, you'll need 64 fps to convince a human that they're seeing 32 fps...

          I don't know too much about it, so if I say more I'll embarrass myself, as I probably already have...
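          For reference, the sampling relation being invoked is just the Nyquist criterion (treating the display as a sampler of on-screen motion; whether that maps cleanly onto human vision is exactly what's in dispute here):

          ```latex
          f_{\mathrm{display}} \;\ge\; 2\, f_{\mathrm{motion}}
          ```

          i.e. to reproduce motion content up to f_motion without temporal aliasing, the frame rate has to be at least twice that -- which is where the "64 fps to faithfully show 32 fps" figure comes from.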
        • Um, you do realize that you cannot perceive framerates over around 32 fps don't you?

          Baka! ("Idiot!")
        • Um, you do realize that you cannot perceive framerates over around 32 fps don't you?

          Ignoring for a second that it's quite obvious to anyone who's actually compared 30FPS with 60FPS+ that this isn't true: if you're averaging 30FPS, chances are you're bottoming out on complex scenes around 20 or even 15FPS, and you have zero margin for more complexity.

          My old GeForce 256/DDR hit 100FPS in Q3; in MoH:AA, using the same engine, there are spots where it hits 10FPS. There are even complex Q3 maps which cause it to become noticeably jerky, even if the average FPS hovers around 100FPS.

          So, no, averaging 32FPS really isn't that great.

          [strokes his Ti4200]

    • Using Q3A and SS to benchmark a brand new card is important because it gives the card a context that can be compared to other cards. If it gets 45fps in UT 2003 or 20fps in Doom 3, what the hell does that mean? Is it a shit card, or are the graphics ultra intense? Now say I run a benchmark using Q3A and the card gets 300fps. Damn, that is much faster than my _insert card here_. Benchmarking is all about reducing the number of variables as close to one as you possibly can. Benchmarks are not about running the latest and greatest games to give you a consumer report on them. It would also be ridiculous to "benchmark" a card using 16x FSAA when no other card could do that. You would have nothing to compare it to, removing all context from the test. You could classify performance as "good" or "bad" but not compare it directly to some other card.
    • Their major concern seems to be frame rates in SS and Q3A, two games built on old technology.

      Exactly the point: if Parhelia dips as low as 35 fps [tech-report.com] running SS @ 1024x768 with nothing turned on, then you know for a fact that it will be near-unplayable on newer more demanding games.

      But, since the available reviews test a whole lot more than just SS and Q3 engine games, we don't need to rely on that obvious deduction. Parhelia also gets its clock cleaned on newer games like Comanche 4 and UT 2003, as well as the mini-games in 3DMark, which are supposed to simulate future game engines. The only difference is, with these games you actually need a GF4 to get decent performance.

      What I'm concerned about is high-resolution performance with AA enabled. I have no intention of ever again running a game below 1024x768 with AA enabled. Why would you, when the Parhelia can do it without breaking a sweat?

      Because the only games that Parhelia can run at 1024x768 with 16x FAA "without breaking a sweat" are the ones "built on old technology" that you denigrated a sentence ago. Check out the scores for UT 2003 with AA and anisotropic filtering [anandtech.com]. Parhelia is most definitely breaking a sweat @ 1024x768. (So, of course, is the GF4.) This is for a game that is going to be out in a month; and not only that, but a deathmatch-oriented game which is most definitely tweaked for high framerates. If you never want to run a game below 1024 with AA, then you'd better either get used to the games that are already out today, or prepare to upgrade video cards very, very often.

      I'd like to see some benchmarks of the Parhelia running DooM3 at 1024x768 w/ 16xAA.

      Too bad: using current drivers, it won't run it. That's because Parhelia's 16x AA is fragment anti-aliasing, which only anti-aliases the edges of polygons, and thus can't deal with a stencil buffer, which is used in Doom3 (and many other games) to render cast shadows. Even assuming a driver fix will allow Parhelia to at least run games with a stencil buffer in FAA mode, the shadows themselves will still have jaggies unless you use the much, much slower FSAA mode. Given how large a part shadows play in the Doom3 experience, it's doubtful FAA will be better than no AA at all. (A toy sketch of the FAA/FSAA trade-off follows this comment.)

      If your goal is to never run below 1024x768 with AA, your best bet is probably to buy a GF4 MX 440 today, ride that as long as it will last, and then upgrade in 12-15 months (before you need a DX8 compatible card, which GF4 MX is not) to whatever the best $150 card then is (i.e. equivalent of today's GF4 Ti4200 or Radeon 8500). You'll get your Doom3 at 1024 with AA, and save $175 over buying the Parhelia today to boot.

      I wonder what The Carmack has to say about this card.

      Me too; in particular, he (or someone) should be able to inform us where the expected throughput benefit for highly multi-textured games has disappeared to. Unfortunately, given the shrug gamers are going to give this card after this morning, he may not even bother.
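      A toy sketch of the FAA-vs-FSAA cost difference described above -- purely illustrative Python with made-up numbers, nothing like the actual hardware path:

      ```python
      # Toy model: FSAA supersamples every pixel on a scanline; FAA (fragment AA)
      # supersamples only pixels flagged as polygon edges. All numbers invented.

      def fsaa_cost(width, samples=4):
          """Full-scene AA: every pixel gets `samples` sub-samples."""
          return width * samples

      def faa_cost(edge_flags, samples=16):
          """Fragment AA: only pixels flagged as polygon edges are supersampled."""
          return sum(samples if is_edge else 1 for is_edge in edge_flags)

      width = 1024
      edge_flags = [i % 20 == 0 for i in range(width)]  # pretend ~5% of pixels sit on an edge

      print("4x FSAA cost :", fsaa_cost(width, 4))       # 4096 shaded samples
      print("16x FAA cost :", faa_cost(edge_flags, 16))  # ~1800 samples

      # The catch: a stencil-buffer shadow boundary is not a polygon edge, so the
      # edge detector never flags those pixels and they stay aliased -- which is
      # exactly the Doom 3 problem described above.
      ```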

  • Over one billion simultaneously displayed colors?

    hmmm 10^9...

    that's a resolution of 10^5 * 10^4 = 100,000 x 10,000, if we assume that to display a billion colors simultaneously you need a pixel per color.

    (NB this was a stupidly petty, but mildly amusing comment)
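    (For what it's worth, the billion-color figure presumably refers to the 10-bit-per-channel output mentioned elsewhere in this discussion -- the size of the palette, not simultaneously lit pixels. A quick check of both numbers:)

    ```python
    # 10 bits per R/G/B channel gives a palette of 2**30 colors:
    print(f"{2 ** 30:,}")               # 1,073,741,824 -- "over one billion"

    # Literally displaying a billion distinct colors at once really would need
    # a billion pixels, e.g. the 100,000 x 10,000 display joked about above:
    print(100_000 * 10_000 == 10 ** 9)  # True
    ```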
  • Most good monitors have a refresh rate of around 85 Hz; the best one I have ever seen was 120 Hz. The NVIDIA Ti4600 is capable of 150 fps. Why do I care about those extra 30-65 fps? What am I missing? It seems pointless to have the video card generate frames quicker than my monitor can show them. By that logic I would want the card with the nicest picture and a sync rate equal to my monitor's, right?
    • by GauteL ( 29207 ) on Tuesday June 25, 2002 @01:03PM (#3763509)
      Well a couple of points:
      1. The quoted fps is an average, not the worst case. The really interesting part for gaming is the worst case. If a gaming card gives you 150 fps average, you can be pretty sure that the worst case is good enough. If a gaming card gives you 40 fps average, you have a bigger chance of hitting unacceptable frame rates. I'd like to see reviewers report the worst case, though.
      2. 150 fps in today's games does not equal 150 fps in tomorrow's games. This means that a card generating very high frame rates is more future-proof than a card that generates 90 fps, even though 90 fps shouldn't look noticeably worse right now.
    • It's not the max frame rate, it's the min frame rate that matters. In the midst of a huge fragfest, you don't want your framerate to drop from 85fps -> 20fps. That will get you fragged. I would far prefer to have the extra power that will keep high, sustainable frame rates. Besides, I can't tell the difference between 4x aa and 16x aa in the middle of a firefight.
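      To put rough numbers on that (a throwaway sketch, not from any of the reviews): what you actually feel is frame time, the reciprocal of frame rate, so a dip costs far more than top-end headroom is worth.

      ```python
      # Frame time is what the player feels; it is the reciprocal of frame rate.
      def frame_time_ms(fps):
          return 1000.0 / fps

      for fps in (150, 85, 60, 35, 20):
          print(f"{fps:>4} fps -> {frame_time_ms(fps):5.1f} ms per frame")

      # A card that averages 150 fps (~6.7 ms/frame) but dips to 20 fps (50 ms/frame)
      # in a big firefight hands you a 43 ms hitch at exactly the wrong moment,
      # even though its benchmark average still looks spectacular.
      ```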
    • You're right - there's not much point in having FPS exceed the refresh rate of your monitor. I think that quick spin-arounds may be slightly smoother, but it's pretty subjective - that's just the way it feels to me.

      However, those frame rates are an average over many different sorts of scenes and can drop much lower during intensive ones - e.g. if you just walked into a rocket arena with 30 guys throwing rockets at each other, with particles, nice lighting and everything, that's when you probably won't be getting 150 FPS anymore. That's when you really need those FPS.

      Also, most people want a bit of headroom for future games.

      As well, some people want more quality - better lighting, more polys, etc. If your card can do 150 FPS without anti-aliasing, maybe it can do 85 FPS with funky lighting and AA on (just an example).

      However, your original point is correct - excess frame rate beyond your monitor's refresh is not really visible, but the extra power comes in handy for other things.

      MeepMeep
    • "Most good monitors have a refresh rate of around 85 hertz the best one I have ever seen was 120 hertz. The NVIDIA TI4600 is capable of 150 fps. Why do I care about those extra 30- 65 fps? What am I missing it seems pointless to have the vid card generate frames quicker then my monitor can show then. On that logic I would want the card with the nicest picture with a sync rate = to my monitor right?"

      You need it because when you are running dolby surround sound and have 12 guys shooting at you, the framerate drops dramatically. That frame-rate-above-refresh-rate is important because you need extra slack for extreme situations!

    • The commonly used benchmarks change about every 12-18 months. Video card power has been doubling every 6 to 9 months, though. This means that if you make a benchmark that has the average new hardware posting framerates of 40 FPS (decent), in a year that same benchmark will be tossing out scores of 100-120.

      So why don't you toss the old benchmark away? The only real reason to go to the trouble of writing a new benchmark program is to test and showcase newer graphics card features. However, if you use the latest features that are only available in $250+ graphics cards, the benchmark can't be used to compare the new card to older cards selling for $150 and under. So all you have really done is make a pretty demo.

      People who buy the cards that score 300FPS in Quake 3 Arena don't pay the money for the Quake 3 Arena framerate; they pay it for the framerate it's going to get in Doom3, and all the games that are coming out in the next year.
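      A quick check of the doubling arithmetic above (illustrative numbers only, assuming throughput really does double every 6-9 months):

      ```python
      base_fps = 40
      for doubling_months in (9, 6):
          projected = base_fps * 2 ** (12 / doubling_months)
          print(f"doubling every {doubling_months} months -> ~{projected:.0f} fps a year later")
      # ~101 and ~160 fps respectively, roughly in line with the 100-120 quoted above.
      ```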
  • More reviews (Score:5, Informative)

    by InnereNacht ( 529021 ) <paulp@lappensecurity.com> on Tuesday June 25, 2002 @12:59PM (#3763478)
    Here: http://www.anandtech.com/video/showdoc.html?i=1645

    Here: http://www.tomshardware.com/graphic/02q2/020625/index.html

    And here: http://www.theinquirer.net/24060221.htm
  • by Anonymous Coward
    yet they benchmark a card built around it.
    i mean, play a game written for the gf3 on a gf4 system. it's slower than on a gf3 because the gf4 is lacking the features the gf3 has, so it has to take a non-optimal path.

    the specs from the new matrox card suggest a more general approach to the rendering pipeline, not that fixed hardware nvidia vertex/pixelshader crap.
    just wait until it's actually USED.

    • i mean, play a game written for the gf3 on a gf4 system. it's slower than on a gf3 because the gf4 is lacking the features the gf3 has, so it has to take a non-optimal path.

      Only if you are stupid and buy into the GF4 MX line. If you get a GF4 Ti then you have all the capabilities of a GF3 and more, so no game optimized for the GF3 feature set would run slower. By the way, no game these days is written for a specific card; some design decisions may be swayed as to which maximum-quality features to support based on a particular card's capabilities, but they should run on any card that meets either the OpenGL 1.2 or D3D (version X where X >= 7) specs.
  • Matrox has been (and probably always will be) considered a maker of great graphics cards in terms of features and quality, but not performance. Sure, its FSAA looks sweet in screenshots and Matrox demos, but my experience has made me see Matrox as a below-average gaming choice, at best. Now, just because it may not deliver a higher framerate than Nvidia's killer boards doesn't mean it has nothing to offer. I remember my old G400 Dual Head... first decent card capable of dual-monitor display. Drivers were solid too.

    I honestly hope that this new board may stand up a bit more to the Nvidia Giant in terms of performance, but realistically I don't think that'll happen.

    • I remember my old G400 Dual Head... first decent card capable of dual-monitor display. Drivers were solid too.

      Please note that it also was the last one; as of the G450, TV-out doesn't work anymore under Linux (and the Parhelia probably won't be much better). Check out the G400 prices on eBay. They're good business, probably because of this. [source: mplayerhq [mplayerhq.hu]]

  • More reviews (Score:2, Interesting)

    by wpmegee ( 325603 )

    Anandtech [anandtech.com] and Tom's Hardware [tomshardware.com] have also posted reviews.

    Anand says that it isn't worth $400, especially in terms of frames per second. GeForce4 Ti 4600s are only $300 online, the Radeon 8500 is only about two Benjamins, and both offer better performance.

    • and both offer better performance.

      See benchmarks comparing the cards with best possible image quality (AA on). Now talk about better performance.
  • Anti-aliasing (16X FAA) looks like it makes a big difference in the quality of the 3D images without sacrificing as much speed as Matrox's competitors do. That will be Matrox's biggest selling point.

    I would love to try the Surround Gaming. I'm sure it would give me an edge playing Quake 3.

    • I remember that changing the FOV in Quake 1 and 2 was considered a cheat, so servers would not let you on if you were using a nonstandard FOV setting. I doubt that anything has changed in that arena, so I bet you would not even be able to play online using "surround gaming", since it needs an FOV change.
  • Or you could get a GeForce4 Ti4400 for $220 + shipping and have a card which is still faster and supports only 1 less monitor. I'd *consider* the Matrox if I had dual (or triple?) LCDs due to the dual DVI, but considering at the moment I only have two 19" CRTs I think I'll keep my Ti4400.
  • Surround Gaming (Score:3, Interesting)

    by jaaron ( 551839 ) on Tuesday June 25, 2002 @01:07PM (#3763538) Homepage
    I've got a nice Matrox dual-head video card for my workstation at work and quite honestly I don't know how I did anything before having two nice 20 inch monitors taking up all the desk space in my cubicle. :)

    Matrox is noted more for its workstation-class video cards than its 3D gaming abilities, but after seeing some of the info on "surround gaming," I don't know . . . I sure wouldn't mind playing the latest game spanning 3 nice LCD monitors.

    For more info, you should check out http://www.matrox.com/mga/3d_gaming/surrgame.cfm [matrox.com]
    or check out these screenshots of Jedi Knight II: http://www.matrox.com/mga/3d_gaming/enhanced_games/knt2.cfm [matrox.com]
    • The card can only do 2 digital outs. I'm not sure, but it might be able to do 1 digital + 2 VGA. So those LCDs (which still suck for games) would be running via analog signals; although it might not be so bad, since Matrox cards consistently have the best image quality of any consumer card out there.

      Also, if you use LCDs make sure you get quality displays with a reasonable viewing angle. You can't exactly look at all three displays directly. But then again, 3 CRTs would take up an AWFUL lot of desk space.

      Hmmm so this basically is another cool sounding feature from Matrox that is rendered almost useless by reality (anyone else recall 'headcasting' ;).
  • From the looks of things, even in regular 3d gaming the ti4200 AND the Radeon 8500 both put the Parhelia to shame.

    After looking at 4 reviews I did notice one thing, though. The image quality appears to be pretty good on the parhelia, but I think the AA from the Geforce4 line can pretty much match the quality of the matrox. Hell, even the radeon's isn't too bad.

    If someone is going to blow $400 on this card, they'd best be just using it for the triple-head display.. I think they'd be disappointed if it was just for a single-monitor setup.
    • I recently purchased both a Radeon 8500 and a GeForce4 Ti4200 (I went for the Radeon first, but had compatibility issues with my motherboard and had to give it to my girlfriend). I've noticed one thing in comparing the two cards: the FSAA on the GeForce4 looks like total crap when compared to the Radeon 8500. The difference is remarkable. The Radeon made things look amazing with 2x AA, and the GeForce4 at 2x makes you check to see if you even turned the AA on after seeing the Radeon. At 4x, things don't improve for the Nvidia card, things just get blurrier.

      Since the Geforce 4 actually works correctly in my machine, I'll take it over the 8500 any day, but Nvidia has to take some lessons from ATI in the anti-aliasing department.
  • by silas_moeckel ( 234313 ) <silas@@@dsminc-corp...com> on Tuesday June 25, 2002 @01:11PM (#3763565) Homepage
    Let's face it, guys, not everybody is a gamer; people do still have real work to do, and let's face it, a lot of the nVidia stuff out there is about average for clarity and high-end connectivity. This card is set right for the low-end DTP and high-end corporate people (for those corps that try not to give all their employees headaches from looking at blurry integrated video on tiny monitors).

    Personally, I have a rig for productivity, a few rigs for games, and the laptops for running around. Productivity machines get multiple monitors and nice cards with so-so procs, and should be nice and quiet. Gaming machines, hey, if it sounds like a 747 it's OK, as long as you can't hear it over the rocket jumps.
  • Here are links to the articles at a few other sites that have reviews up:

    Anandtech [anandtech.com]

    Tom's Hardware [tomshardware.com]

    Tech Report [tech-report.com]

    Extreme Tech [extremetech.com]

  • by e40 ( 448424 ) on Tuesday June 25, 2002 @01:11PM (#3763569) Journal
    However, there are always problems:

    1. The hardware never lives up.

    2. Three words: drivers, drivers, drivers. Or, should I have just said "software, software, software"? Putting out beta drivers and leaving them out there for over a year without a final release is par for the course at Matrox. OpenGL promises, you say? Yeah, been there, done that.

    I won't trust Matrox ever again. I was screwed by two generations of cards. Yeah, yeah, shame on me for trusting them a second time.
    • This isn't just an issue with Matrox, but with the whole video card industry.

      No card I have ever purchased has come with a worthwhile driver. The GeForce 2 Ti I bought didn't have even remotely good drivers until the last revision, and the card still occasionally causes a BSOD under WinXP.

      I think these companies need to get on top of writing better drivers for their cards, because, simply put, the card may be the l33test thing ever, but it won't beat an S3 Trio off the starting line without good drivers...
    • i've never really used matrox products, but they sound like they've borrowed software practices from ATI.

      if you want something that works well out of the box quickly, get nvidia. if open source is your game, then roll the dice with ati, but do your research on the chipset first. anything else, and you're probably stuck using mame to play asteroids.
    • Or, should I have just said "software, software, software"?

      Just say "Developers, developers, developers, developers, developers, developers, developers, developers, developers ..."

    • 1: did you say beta? I think I heard beta...now why were you assuming that beta drivers would do it for you?

      2: Matrox has NEVER promoted their current cards as 3D cards; who told you to buy one for gaming? Give them shit, not Matrox.

      3: From PERSONAL experience, Matrox has traditionally supplied the most stable drivers with the most features RIGHT OUT OF THE BOX!!! You can go pull an old G400 card off the shelf of some store room, plug it in, install the drivers and it'll be as stable as if you went and downloaded their latest. (Obviously you'd have to do that to get support for newer OSes, but the point is valid.)

      4: THERE IS ABSOLUTELY NOTHING DRIVERS CAN DO TO ADD FUNCTIONALITY THAT DOESN'T ALREADY EXIST IN THE HARDWARE ITSELF!!!

      I think you'd find that if you had taken the time to properly inform yourself on your purchase, you would have avoided this problem. I have been using a Matrox G450 dual monitor card for over 2 years at work; never once have I wanted to change it for another card, and never once have I had any problems of any sort. At home however, where I do my gaming, you certainly won't find any Matrox cards. (A great big stack of various S3, ATI and nVidia cards, but nary a Matrox in sight.)

      In conclusion, please don't slam a very good company for your own failure to be an informed consumer.
      • by TRACK-YOUR-POSITION ( 553878 ) on Tuesday June 25, 2002 @03:04PM (#3764302)
        2: Matrox has NEVER promoted their current cards as 3D cards; who told you to buy one for gaming? Give them shit, not Matrox.

        Their post-G200 cards were certainly promoted as 3D cards. They came with lots of 3D games in their boxes. I'm not saying they were bad cards, but they were certainly sold as 3D cards.

        3: From PERSONAL experience, Matrox has traditionally supplied the most stable drivers with the most features RIGHT OUT OF THE BOX!!! You can go pull an old G400 card off the shelf of some store room, plug it in, install the drivers and it'll be as stable as if you went and downloaded their latest. (Obviously you'd have to do that to get support for newer OSes, but the point is valid.)

        Speaking of ancient cards, it's worth noting that the nVidia Detonator drivers are the very same for everything from the TNT to the GeForce 4, despite the very radical differences between them. No, the ones out of the box may not have been so great--but for likely all eternity you'll be able to download new, great ones, with lots of new features, which brings me to...

        4: THERE IS ABSOLUTELY NOTHING DRIVERS CAN DO TO ADD FUNCTIONALITY THAT DOESN'T ALREADY EXIST IN THE HARDWARE ITSELF!!!

        Man, how could you have a "great big stack" of video cards and honestly say something so contrary to reality? There's a whole lot of software in between your game and your graphics card. Obviously, by upgrading this software, you can get improvements in frame rate, quality, and yes, features. Like the time I downloaded the new nVidia driver and, suddenly, I had anti-aliasing. Like every time I download a new DirectX SDK and look at all the demos--they do stuff previous versions could not do. Software and hardware have to work together to give you a good user experience.

      • 3: From PERSONAL experience, Matrox has traditionally supplied the most stable drivers with the most features RIGHT OUT OF THE BOX!!! You can go pull an old G400 card off the shelf of some store room, plug it in, install the drivers and it'll be as stable as if you went and downloaded their latest. (Obviously you'd have to do that to get support for newer OSes, but the point is valid.)
        >>>>>>>>>
        Like the abysmal OpenGL drivers in the G200 era?
      • You bought an S3 card over a Matrox?
  • This new card... (Score:3, Interesting)

    by Steveftoth ( 78419 ) on Tuesday June 25, 2002 @01:22PM (#3763631) Homepage
    The new Parhelia is a great new card. It won't appeal to everyone, as it doesn't have the frame rates that everyone seems to think are the only indication of performance in graphics today. Everyone seems to think that fillrate == king and that all the other features are secondary. Which is why nVidia sells most of its cards with only enough features to get them out the door; most don't have TV out, multiple monitors, TV in, etc. ATI has done a great job of creating multiple products that do many different things.

    Matrox's new card has one feature that no other card can match yet, and that's the three monitor support. There is no other single card that has the low price and three monitor support. And no AGP/PCI solution will let you play one game on three monitors.

    I don't think that I'll be buying one, but that's just because I don't think the video card should cost 50% of your system's total value.
    • You're right that some people will like this card and some won't. I, personally, won't. I almost never use any special features of my card. Multiple monitors? Don't have it, don't need it. TV in? Don't have it, don't need it. TV out? I do have this, but never use it.

      I completely agree that 3 monitors would be awesome, but the problem is that each monitor is capped at 1280x1024, and if you want to play a game, each monitor is capped at 800x600. How many people would love 3 monitors at that rez?
  • With anti-aliasing, pictures may look good enough, or even better, at lower resolutions, leading to higher frame rates.

    Optimizing for (subjective quality) * (framerate) may have an optimum at a lower resolution than it does on cards without AA.
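    Put slightly more formally (my notation, purely illustrative): with r the resolution and a the AA setting, the suggestion is to pick

    ```latex
    (r^{*}, a^{*}) \;=\; \arg\max_{r,\,a}\; Q(r, a)\, F(r, a)
    ```

    and the claim is that AA flattens Q's dependence on r, so the maximum can land at a lower resolution (and hence a higher frame rate F) than it would on a card without AA.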
  • "You want great looking Anti-Aliasing? The Parhelia is for you. You want killer frame rates? You might have to look elsewhere."

    Why does this so vividly remind me of 3Dfx vs NVidia a little over two years ago? 3Dfx had their uber-AA system, but it would drag Quake3 down to about 8 frames per second while the butt-ugly TNT2 just cruised along at a clean 40 fps (which was remarkable back then). 3Dfx collapsed months later when they learned one of the golden rules of computing: quantity over quality.

    Granted, Matrox's prime market isn't the gaming sector; they've truly carved their throne in the business sector, filling in the gap left behind by Number Nine. But now they're trying to market to the gamers with this feature-packed chipset, and I fear they're going to fall flat on their ass just like last time. If we've learned anything from NVidia, it's that people are willing to buy gobs of GPU power at insane prices. Your GeForce2 is too slow to play UT2 in 1600x1280? Then get the GeForce4, with two GPUs this time for more power.

    If Matrox wants a share of my gaming budget, they'll have to start putting more raw goodness into their boards. Heck, just figure out how to link two or four Matrox GPUs and make them spew pixels like there's no tomorrow.
  • clock speeds (Score:1, Insightful)

    by Anonymous Coward
    I am going to wait to see what the overclocking results are like.... I have a feeling the P-512 is losing due to its lower clock speed rather than its hardware features/setup. Matrox are rumoured to release a FULL DX 9.0 hardware-compliant card next year on a 0.13u process with much higher clock speeds.

    Going by what I have seen so far, the P-512 does well on possible "future" applications (aka games) but poorly compared to cards with higher clock speeds when using "current" applications. Think about how the GF3 was slower than a GF2 Ultra on DX7 games due to the GF3 being 200MHz while the GF2U was 250MHz. Once DX8 applications came along, the GF3 spanked the GF2U thanks to the features unique to the GF3 (at that time).

  • The latest issue of Maximum PC has an extensive review of this card, going into great depth on how it works and what the improved functions get you. If you want to know the nitty-gritty details of this card, get that issue. The card they tested was a beta, so they warn not to rely on their benchmarks, but they said that Matrox had to pry it from their fingers, particularly after they tried out its triple-screen views for gaming.
  • Unfair tests (Score:3, Interesting)

    by Anonymous Coward on Tuesday June 25, 2002 @01:43PM (#3763763)
    A lot of the tests available have been unfair to the Parhelia.

    Many were done at 640x480 which is not the card's strength.

    Give the card games with huge textures and run it at 1280x1024 and above and you will see how it outshines even nVidia's best offer.

    However, there are not many games (any at all?) which really can stress this card at that level. So, for now, Parhelia buyers will have to accept that the nVidia GeForce and ATI Radeon cards are better suited for today's games.

    As usual it is a question of the chicken and the egg. Which comes first, the game or the card?

    The Parhelia appears today a tad early to the market.
  • It'll get better (Score:3, Insightful)

    by T5 ( 308759 ) on Tuesday June 25, 2002 @01:52PM (#3763823)
    The Parhelia is Matrox's first attempt at a competitive 3D card. As the process shrinks, the speeds will go up. And the drivers will mature over time.

    How much better it'll get is a valid speculative point. Did they hire any of the old 3Dfx crew?
  • The GeForce4 card clearly beat the Matrox card in the 3DMark 2001 SE vertex shader test, but the Matrox card wins by a mile in "SharkMark", which is written by Matrox. What a surprise.

    I don't understand why Tom's decided to include the SharkMark benchmark in their review.

  • by Dr. Spork ( 142693 ) on Tuesday June 25, 2002 @01:59PM (#3763866)
    It seems to me that NWN is currently the single biggest cause of "forced" graphics card updates. I've seen several reports that even with a GF4-4600 and a top-of-the-line processor, some situations cause unacceptable FPS slowdowns when the AA is enabled (though the game looks otherwise great).

    I seriously doubt that people are buying these cards to play Quake or Serious Sam, so why is it always these games that get benchmarked?

    Because various 3D engines use different technologies, and these are in turn supported differently by the card manufacturers, it's not possible to simply extrapolate from Quake results to NWN.

    • I've been running NWN for about 30-40 hours since it came out, and I'd wait for another patch or two before using it as a benchmark of anything. The game runs beautifully at 1600x1200 with the highest anti-aliasing settings I can use on a GeForce 3 and Athlon XP 1800 system. For about an hour, that is. After about an hour the game just starts to get choppier and choppier and choppier. After the game had been running for about 7 hours last night (to preserve the multiplayer settings on the server due to a multiplayer save issue, not because I sat there for 7 hours straight) the game was so choppy I got motion sick. Restarting the game put it back to its normal beautifully rendered self, though.
      • Sounds like a serious memory leak! But as to your main point, the one about NWN being too fresh to be benchmarked reliably, I guess that makes sense. On the other hand, its pickiness about hardware would make a site that provides information about which setups work (and how well) very useful. But yeah, you're right that right now, NWN tests would basically reveal a lot more about the errors of the programmers than the intrinsic qualities of the hardware.
  • Will there ever be Linux drivers? I don't mind giving up a bit of performance if there is source code.
  • I have had a Matrox G400 card for a few years. I love the dual-heading (I currently run two 17-inch monitors), and especially the easy setup utility for XFree86 that Matrox provides on their site; getting the whole setup working under Red Hat 7.2 / KDE was no harder than under Win2k. When I upgrade, or buy a new box, I will probably get the Parhelia if my budget allows.
  • Matrox has always been the king of making quality 2D cards with killer features for the business and graphics markets. They have never been a compelling 3D company. Their 3D engineering team has always been riding behind the pack. They focus on a few very interesting features to get recognition, but always manage to put out a poor 3D product. This board is their submission for their 2D customers who want an exceptional 2D board with *average* 3D gaming performance. Expect these to be snatched up by the programmers and graphic artists who like to play games once in a while. There are lots of people who will grab them just for the easy-to-set-up triple-head design who never game at all. If you're not going to hook up two or three monitors to this thing, stay away from it like the plague.
  • I read many complaints that the $400 price tag is too high for something that's no GF4 killer. Think for a few more minutes before making a judgement.

    What about triple head output?
    What about 10-bit per channel RAMDACs?
    What about 5th order output filter?
    What about jitter-free sync signals?
    What about an 80-million-transistor GPU?

    Clearly the biggest transistor count among cards of this category. Hello? The transistors are there for something (and they cost)! They are not there for yesterday's games but for future ones. They do edge-based antialiasing, which means beautiful graphics and no blurred fonts or 2D images anymore. Hardware displacement mapping sure also needs a lot of transistors.

    See the Matrox SharkMark benchmark to see what the card is capable of, once the games start to use new technologies.
    ATI:91fps, NV:111fps, Matrox:166fps.

    Also see benchs comparing best image quality performance (AA on).
    Q3 1600x1200x32: NV:37fps, Matrox:41fps (source for all benchs tomshardware.com).

    OK, ready for the flames :)
  • Check out www.tech-report.com [tech-report.com] for a good first look. Particularly towards the end, where they show a 16x FAA screenshot & give AA benchmarks. Some aniso filtering reporting too.

    Everything's a bit light on right now, as most sites only received their hardware late last week.

  • Here's John Carmack's take on the Matrox Parhelia. http://www.bluesnews.com/plans/1/

    You can also finger his email account if you are so inclined. (But you spammers will have to figure out the address for yourselves.)

    To summarize:

    not as fast as a GF4 or Radeon

    AA is nice, but not fast enough

    10 bit color is nice, but not nice enough

    drivers suck...at the moment
