Matrox Parhelia Benchmarks and Review
Crankshaft writes "Matrox Parhelia boards are due to hit retail shelves by the end of the month. Cards have been sent out and the benchmarks are up, showing the card's strengths and weaknesses. You want great looking Anti-Aliasing? The Parhelia is for you. You want killer frame rates? You might have to look elsewhere."
Their slogan... (Score:1, Funny)
Hope it's not as shitty as that John Cusack movie.
~"Oh my god! The dead have risen and are voting republican!"
Re:Their slogan... (Score:1)
uhh.. (Score:2, Insightful)
I predict that this card may eventually be popular in high-end workstations, with Matrox fans (if there are any), and with people who for one reason or another just don't like nVidia.
It may also take some market share away from ATi, but I don't suspect it will make a huge dent.
The stats really just don't impress me. Then again, I'm a heavy Windows gamer, and from reading their white papers on this card they must not be trying for the gamer market.
I'm just gonna sit back and wait for the GeForce 5, just like I waited for the Voodoo 5500. Hopefully nVidia doesn't go out of business.
Re:uhh.. (Score:3, Interesting)
What I'm interested in is a card that can pull frame rates near GF4 speeds but with 2D that is superior and clear to look at. Dual or triple head output is also a very nice feature for the workstation market.
Re:uhh.. (Score:1, Insightful)
As for the low res/framerate thingy, that's why I think it will be a great card for workstations.
I don't think this card is garbage at all; it has some neat features. It just doesn't pump out the high-res framerates/picture quality I'd want in my next gaming rig.
Then again, this is just the first run of this technology. In the current atmosphere of PC technology, in 6 months a newer version of this card may be the cat's meow.
Re:uhh.. (Score:2, Informative)
The triple-headed desktop is probably the most understated feature of this card. Talk to anyone who deals with graphics day-to-day - 3D animators, video editors, graphic designers - and they'll tell you the extra screen real estate is a big boost in productivity.
Here are a few triple-head screenshots from Matrox's site. [matrox.com] And I believe that odd three-panel monitor is from Panoram Tech. [panoramtech.com]
Re:uhh.. (Score:2)
Now, I could see your argument working if the Parhelia was priced and marketed as a budget card, but the price and marketing suggest a GeForce killer. A price a little higher than a Radeon 8500's, but not by much, might be more reasonable, since it seems much closer to that sort of level than to a GeForce 4...
I personally think ATI's cards are a better deal for Windows-only gaming (based on price/performance ratio), but with Linux nVidia becomes the only viable cutting-edge candidate (the 8500 still isn't 3D accelerated with DRI). I wonder how this new Parhelia card will be with Linux drivers...
Re:uhh.. (Score:2, Informative)
I own the latest/greatest examples of both and I'm here to personally tell you that it is still very much a Matrox-only game, which just had the ante upped.
ATI's and nVidia's feeble attempts at multi-monitor support are a nice try at best, and a total fucking insult at worst.
Re:uhh.. (Score:1)
Exactly. It is rare even today to see an ATI or NVidia card that does the simple things like displaying the desktop as clearly and accurately as my loyal 4MB Matrox Mystique 220.
Re:uhh.. (Score:2)
Say in a UT2003 benchmark the GF4 can go 30% faster, but what happens when the going gets tough? Does one card drop to, say, 50% speed while the Matrox card "might" not drop more than 10%, even though its top speed isn't as fast? This is the kind of information I would like to see in a graphics card review.
I'm not a gamer (Score:2)
Not everybody cares about games. I, for one, haven't played a game on my box for as long as I've had it. But I stare at this screen for 13 hours a day, and all I care about is that the things I look at (no, it's not pr0n :-) ) don't make my eyeballs fall out.
I've got a Matrox G450, and I'm pretty happy with it. AFAIK, Matrox is supportive of Linux; they have themselves released GPLed software. It's not going to be my last Matrox card.
Re:uhh.. (Score:2, Insightful)
Matrox can never get straight their real "intention", or at least -- that's what it seems like anyhow.
They have been trying to blend the workstation / gamer market together into one card ever since the G200s, and well, guess what, it has never been successful. Businesses think it's too "game based", and gamers think they are too slow.
I digress -- they are not all that slow. Frankly I do not understand why people go out and get Ti4600s. I mean -- fine, most arguments come in the form of "it will last me longer for the future generation of games." My ass. If you buy a Ti4600 today, I will bet a dollar to a donut that you will buy the next high-end "gamer" card as soon as it comes out too - again in the same predicament -- no games support it.
But, unfortunately, that does seem to be the market right now...
Getting back -- what should Matrox do to gain market share? Probably separate the two classes -- or, for **'s sake, just do different packaging and put a different skin on the drivers. Clocking them a bit differently would help too. And then cram that useless "face duplicate" technology into the business model, charge 100 bux extra for it, and voila...
Yeah, I would get one if it was a bit cheaper too. But right now everything I run is okay on a Radeon 7500... but then again I also buy games "late" too -- about a year after they are out, they get real cheap -- ~10-20 bux or so.
Last bit. HotHardware says that the Matrox card is "elegant" because it does not have "canned capacitors"? Man, what a load of crap that is. The "canned caps" are for power regulation, and they're there because chip caps do not get above a few microfarads. Having or not having "canned capacitors" should absolutely not be a factor upon which you rate a card. If necessary -- I would personally take TONS of capacitors if it means the darn thing runs more stable.
Re:uhh.. (Score:1)
Perhaps the cards are okay, but the end result matters most. And when I have to spend half an hour poking at settings so my DVI panel gives a proper picture, only to lose those settings every time (read: often) I have to upgrade the drivers to fix a small but annoying bug, it's not fun. For a lot of 2D users, the good old Matrox Millennium II and G200 cards are often perfect.
I never expected Matrox to do an all-out card. I want to know that Matrox provides a steady multi-use desktop card with reliable drivers. nVidia can continue providing high-end fast-but-buggy setups for gamers, and ATI seems to find their way somewhere in between. And there just might be other players around too.
Isn't that the main thing - when the number of choices is higher than one, it can't be all bad.
Re:uhh.. (Score:2)
One thing that I am curious about is how adding two more monitors affects all of those benchmarks. If there is only minor degradation, then this card is farther ahead in the game, IMHO, than people are willing to admit.
Re:uhh.. (Score:2)
It all depends on how you value things. Matrox has consistently put out better Linux support than Nvidia (with their binary drivers) and ATI (which certainly takes its time getting support to cards). They also came to Linux early on, and do a good job of releasing tech specs. There are special Matrox drivers in mplayer that write directly to the card for even more speed -- something that exists for no other card. Xv and 3D support are solid, without crashes or weirdness.
I've used both a G200 and a G450, and as long as Matrox continues to be the most pro-Linux of the major graphics card vendors, they'll keep my business (by the same token, if they turn away from Linux, I'll find another vendor).
Tom also has a review of it (Score:4, Informative)
Re:Tom also has a review of it (Score:2)
Re:Tom also has a review of it (Score:1)
Well I got suckered in... (Score:2)
it'll be like the old Matrox G400 - runs decent and looks great. I guess it all comes down to speed vs. pretty. Maybe they'll fix it in the drivers! Of course that's what they all say.
Wrong direction (Score:1)
MaximumPC article about Parhelia (Score:1)
Short-sighted (Score:2, Interesting)
I wonder what The Carmack has to say about this card. I'd like to see some benchmarks of the Parhelia running DooM3 at 1024x768 w/ 16xAA. Now that the NDA's are lifted, I hope he'll wake these people up to the fact that there is life after Q3A.
Re:Short-sighted (Score:1)
Oh, until Doom arrives, Q3A will be used, hate it or not
Re:Short-sighted (Score:1, Insightful)
Who cares what Carmack thinks? He's not about to say "The Parhelia is the only card for Doom 3" - there's no way he'd shoot himself in the foot like that. Think of how many people get software bundles with their new video cards, and often these include games like Quake and its brethren; but if Carmack makes it known that he's optimized or somehow skewed it so that it's better on the Parhelia, you're not going to find a lot of people getting special versions of Doom 3 with their cards. Unreal Tournament 2003 maybe?
As for waking up and finding life after Quake 3... what would you suggest these people play? Most everything else is just a clone, or a licensed clone. If you want a fucking beautiful slideshow, there's this game called Myst you might enjoy, and if you want to show off your nice new card, RealMyst.
Anandtech looked at UT 2003 (Score:5, Informative)
In the review Anand attributed the Parhelia's poor UT 2003 showing to three things. Quoting from Anand's review:
1) Low GPU Clock (220MHz vs. 250 - 300MHz)
2) Sub optimized drivers
3) A lack of serious occlusion culling technology
Whatever the reasons, the Parhelia didn't score well on one of the more anticipated and graphically intensive games that will be released in the near future.
Re:Short-sighted (Score:1)
I'd rather have my game running at 100FPS@1600x1200 with no AA than 35FPS with 16X AA.
Re:Short-sighted (Score:1)
Re:Short-sighted (Score:1)
Re:Short-sighted (Score:2, Informative)
Re:Short-sighted (Score:2)
I don't know too much about it, so if I say more I'll embarrass myself, as I probably already have...
Re:Short-sighted (Score:1)
Baka!
Re:Short-sighted (Score:2)
Ignoring for a second that it's quite obvious to anyone who's actually compared 30FPS with 60FPS+ that this isn't true: if you're averaging 30FPS, chances are you're bottoming out on complex scenes around 20 or even 15FPS, and you have zero margin for more complexity.
My old GeForce 256/DDR hit 100FPS in Q3; in MoH:AA, using the same engine, there are spots where it hits 10FPS. There are even complex Q3 maps which cause it to become noticeably jerky, even if the average FPS hovers around 100FPS.
So, no, averaging 32FPS really isn't that great.
[strokes his Ti4200]
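To put numbers on that, here's a quick back-of-the-envelope sketch - the frame times below are completely made up for illustration, not taken from any benchmark:

```python
# Hypothetical frame times (in ms) for a run that "averages" around 30 FPS:
# mostly easy scenes, plus a handful of heavy ones (rockets, particles, etc.).
frame_times_ms = [25] * 90 + [70] * 10  # 90 light frames, 10 heavy frames

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
worst_fps = 1000 / max(frame_times_ms)

print(f"average: {avg_fps:.0f} FPS, worst case: {worst_fps:.0f} FPS")
# prints roughly: average: 34 FPS, worst case: 14 FPS
```

The average looks fine on a benchmark chart; the worst case is what you actually feel in a firefight.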
Re:Short-sighted (Score:2)
Re:Short-sighted (Score:3, Interesting)
Exactly the point: if Parhelia dips as low as 35 fps [tech-report.com] running SS @ 1024x768 with nothing turned on, then you know for a fact that it will be near-unplayable on newer more demanding games.
But, since the reviews available test a whole lot more than just SS and Q3 engine games, we don't need to rely on that obvious deduction. Parhelia also gets its clock cleaned on newer games like Comanche 4 and UT 2003, as well as the mini-games in 3DMark, which are supposed to simulate future game engines. The only difference is, with these games you actually need a GF4 to get decent performance.
What I'm concerned about is high-resolution performance with AA enabled. I have no intention of ever again running a game below 1024x768 with AA enabled. Why would you, when the Parhelia can do it without breaking a sweat?
Because the only games that Parhelia can run at 1024x768 with 16x FAA "without breaking a sweat" are the ones "built on old technology" that you denigrated a sentence ago. Check out the scores for UT 2003 with AA and anisotropic filtering [anandtech.com]. Parhelia is most definitely breaking a sweat @ 1024x768. (So, of course, is the GF4.) This is for a game that is going to be out in a month; and not only that, but a deathmatch-oriented game which is most definitely tweaked for high framerates. If you never want to run a game below 1024 with AA, then you'd better either get used to the games that are already out today, or prepare to upgrade video cards very, very often.
I'd like to see some benchmarks of the Parhelia running DooM3 at 1024x768 w/ 16xAA.
Too bad: using current drivers, it won't run it. That's because Parhelia's 16x AA is fragment anti-aliasing, which only AA's the edges of polygons, and thus can't deal with a stencil buffer which is used in Doom3 (and many other games) to render cast shadows. Even assuming a driver fix will allow Parhelia to at least run games with a stencil buffer in FAA mode, the shadows themselves will still have jaggies unless you use the much much slower FSAA mode. Given how large a part shadows play in the Doom3 experience, it's doubtful FAA will be better than no AA at all.
If your goal is to never run below 1024x768 with AA, your best bet is probably to buy a GF4 MX 440 today, ride that as long as it will last, and then upgrade in 12-15 months (before you need a DX8 compatible card, which GF4 MX is not) to whatever the best $150 card then is (i.e. equivalent of today's GF4 Ti4200 or Radeon 8500). You'll get your Doom3 at 1024 with AA, and save $175 over buying the Parhelia today to boot.
I wonder what The Carmack has to say about this card.
Me too; in particular, he (or someone) should be able to inform us where the expected throughput benefit for highly multi-textured games has disappeared to. Unfortunately, given the shrug gamers are going to give this card after this morning, he may not even bother.
Over one billion simultaneously displayed colors (Score:1)
Over one billion simultaneously displayed colors?
hmmm 10^9...
that's a resolution of 10^5 * 10^4 = 100,000 x 10,000, if we assume that to display a billion colors simultaneously you need a pixel per color.
(NB this was a stupidly petty, but mildly amusing comment)
Re:Over one billion simultaneously displayed color (Score:2)
what am I missing about vid cards? (Score:1)
Re:what am I missing about vid cards? (Score:5, Insightful)
Re:futureproof? (Score:2)
My next card will be in the same mold. I think I'll wait about a year before I buy it though, but it depends on whether some new extremely good games really NEED a better card.
Re:what am I missing about vid cards? (Score:1)
Re:what am I missing about vid cards? (Score:3, Insightful)
However, those frame rates are an average over many different sorts of scenes and can drop much lower during intensive scenes - e.g. if you just walked into a rocket arena with 30 guys throwing rockets at each other, with particles, nice lighting and everything, that's when you probably won't be getting 150 FPS anymore. That's when you really need those FPS.
Also, most people want a bit of headroom for future games.
As well, some people want more quality - better lighting, more polys, etc. If your card can do 150 FPS without anti-aliasing, maybe it can do 85 FPS with funky lighting and AA on (just an example).
However, your original point is correct - excess frame rate beyond your monitor's refresh is not really visible, but the extra power comes in handy for other things.
MeepMeep
Re:what am I missing about vid cards? (Score:2)
You need it because when you are running Dolby surround sound and have 12 guys shooting at you, the framerate drops dramatically. That frame-rate-above-refresh-rate is important because you need extra slack for extreme situations!
Re:what am I missing about vid cards? (Score:1)
So why don't you toss the old benchmark away? The only real reason to go to the trouble of writing a new benchmark program is to test and showcase newer graphics card features. However, if you use the latest features that are only available in $250+ graphics cards, the benchmark can't be used to compare the new card to older cards selling for $150 and under. So all you have really done is make a pretty demo.
People who buy the cards that score 300FPS in Quake 3 Arena don't pay the money for the Quake 3 Arena framerate; they pay it for the framerate they're going to get in Doom 3, and in all the games that are coming out in the next year.
Re:what am I missing about vid cards? (Score:1)
60 Hertz, that's kind of like a strobe light... on most monitors.
Re:what am I missing about vid cards? (Score:2)
There's a big difference between refresh rate and frame rate--if you truly meant you can tell the difference between 60 frames and 75 frames per second, you are far more perceptive than I.
More reviews (Score:5, Informative)
Here: http://www.tomshardware.com/graphic/02q2/0
And here: http://www.theinquirer.net/24060221.htm
Re:More reviews (Score:1, Informative)
http://www.anandtech.com/video/showdoc.html?i=1645 [anandtech.com]
http://www.tomshardware.com/graphic/02q2/020625/index.html [tomshardware.com]
http://www.theinquirer.net/24060221.htm [theinquirer.net]
Re:More reviews (Score:1)
Copy and paste instructions:
Linux Users: Left click, drag over the text, middle click in your browser address bar.
Windows Users: Left click, drag over the text, CTRL+C, left click in your browser address bar, CTRL+V.
Mac Users: Uhm.. Forget it.
ogl2.0 standard isn't even frozen .. (Score:1, Insightful)
I mean, play a game written for the GF3 on a GF4 system: it's slower than on a GF3, because the GF4 is lacking features the GF3 has, so it has to take a non-optimal path.
The specs for the new Matrox card suggest a more general approach to the rendering pipeline, not that fixed-hardware nVidia vertex/pixel shader crap.
Just wait until it's actually USED.
Re:ogl2.0 standard isn't even frozen .. (Score:2, Informative)
Only if you are stupid and buy into the GF4 MX line; if you get a GF4 Ti then you have all the capabilities of a GF3 and more, so no game which is optimized for the GF3 feature set would run slower. By the way, no game these days is written for a specific card. Some design decisions may be swayed as to what max-quality features to support based on a particular card's capabilities, but they should run on any card that meets either the OpenGL 1.2 or D3D (version X where X >= 7) specs.
Matrox isn't for everyone... (Score:1)
I honestly hope that this new board will stand up a bit more to the nVidia giant in terms of performance, but realistically I don't think that'll happen.
Re:Matrox isn't for everyone... (Score:2)
Please note that it also was the last one; as of the G450, TV-out doesn't work anymore under Linux (and the Parhelia probably won't be much better). Check out the G400 prices on eBay. They're good business, probably because of this. [source: mplayerhq [mplayerhq.hu]]
More reviews (Score:2, Interesting)
Anandtech [anandtech.com] and Tom's Hardware [tomshardware.com] have also posted reviews.
Anand says that it isn't worth $400, especially in terms of frames per second. And GeForce4 Ti 4600s are only $300 online, the Radeon 8500 is only about two Benjamins, and both offer better performance.
Re:More reviews (Score:1)
See benchmarks comparing the cards with best possible image quality (AA on). Now talk about better performance.
16X FAA (Score:1)
I would love to try the Surround Gaming. I'm sure it would give me an edge playing Quake 3.
Re:16X FAA (Score:2)
GeForce4 Ti4400 (Score:2)
Surround Gaming (Score:3, Interesting)
Matrox is noted more for its workstation-class video cards than its 3D gaming abilities, but after seeing some of the info on "surround gaming," I don't know . . . I sure wouldn't mind playing the latest game spanning 3 nice LCD monitors.
For more info, you should check out [matrox.com]
http://www.matrox.com/mga/3d_gaming/surrgame.cf
http://www.matrox.com/mga/3d_gaming/enhanced_ga
Re:Surround Gaming (Score:1)
Also, if you use LCDs, make sure you get quality displays with a reasonable viewing angle. You can't exactly look at all three displays directly. But then again, 3 CRTs would take up an AWFUL lot of desk space.
Hmmm, so this basically is another cool-sounding feature from Matrox that is rendered almost useless by reality (anyone else recall 'headcasting'?)
GF4 TI4200. (Score:1)
After looking at 4 reviews I did notice one thing, though. The image quality appears to be pretty good on the Parhelia, but I think the AA from the GeForce4 line can pretty much match the quality of the Matrox. Hell, even the Radeon's isn't too bad.
If someone is going to blow $400 on this card, they'd best be just using it for the triple-head display.. I think they'd be disappointed if it was just for a single-monitor setup.
Re:GF4 TI4200. (Score:2)
Since the Geforce 4 actually works correctly in my machine, I'll take it over the 8500 any day, but Nvidia has to take some lessons from ATI in the anti-aliasing department.
Not everybody is a gamer (Score:3, Interesting)
Personally, I have a rig for productivity, a few rigs for games, and the laptops for running around. Productivity machines get multiple monitors and nice cards with so-so procs, and should be nice and quiet. Gaming machines? Hey, if it sounds like a 747 it's OK, as long as you can't hear it over the rocket jumps.
Links to other articles (Score:2, Informative)
Anandtech [anandtech.com]
Tom's Hardware [tomshardware.com]
Tech Report [tech-report.com]
Extreme Tech [extremetech.com]
Matrox cards always look good on paper (Score:5, Insightful)
1. The hardware never lives up.
2. Three words: drivers, drivers, drivers. Or, should I have just said "software, software, software"? Putting out beta drivers and leaving them out there for over a year without a final release is par for the course at Matrox. OpenGL promises, you say? Yeah, been there, done that.
I won't trust Matrox ever again. I was screwed by two generations of cards. Yeah, yeah, shame on me for trusting them a second time.
Re:Matrox cards always look good on paper (Score:1, Insightful)
No card I have ever purchased has come with a worthwhile driver. The GeForce 2 Ti I bought didn't have even remotely good drivers until the last revision, and the card still occasionally causes a BSOD under WinXP.
I think these companies need to get on top of writing better drivers for their cards, because simply put, the card may be the l33test thing ever, but it won't beat an S3 Trio off the starting line w/o good drivers...
Re:Matrox cards always look good on paper (Score:1)
If you want something that works well out of the box, quickly, get nVidia. If open source is your game, then roll the dice with ATI, but do your research on the chipset first. Anything else, and you're probably stuck using MAME to play Asteroids.
Re:"software, software, software"? (Score:1)
Just say "Developers, developers, developers, developers, developers, developers, developers, developers, developers ..."
Re:Matrox cards always look good on paper (Score:1, Informative)
2: Matrox has NEVER promoted their current cards as 3D cards; who told you to buy one for gaming? Give them shit, not Matrox.
3: From PERSONAL experience, Matrox has traditionally supplied the most stable drivers with the most features RIGHT OUT OF THE BOX!!! You can go pull an old G400 card off the shelf of some store room, plug it in, install the drivers and it'll be as stable as if you went and downloaded their latest (obviously you'd have to in order to get support for the newer OSes, but the point is valid).
4: THERE IS ABSOLUTELY NOTHING DRIVERS CAN DO TO ADD FUNCTIONALITY THAT DOESN'T ALREADY EXIST IN THE HARDWARE ITSELF!!!
I think you'd find that if you had taken the time to properly inform yourself on your purchase, you would have avoided this problem. I have been using a matrox G450 dual monitor card for over 2 years at work, never once have I wanted to change it and get another card, never once have I had any problems of any sort. At home however, where I do my gaming, you certainly won't find any matrox cards. (A great big stack of various s3, ati and nvidia cards, but nary a matrox in sight.)
In conclusion, please don't slam a very good company for your own failure to be an informed consumer.
Re:Matrox cards always look good on paper (Score:4, Insightful)
Their post-G200 cards were certainly promoted as 3D cards. They came with lots of 3D games in their boxes. I'm not saying they were bad cards, but they were certainly sold as 3D cards.
3: From PERSONAL experience, Matrox has traditionally supplied the most stable drivers with the most features RIGHT OUT OF THE BOX!!! You can go pull an old G400 card off the shelf of some store room, plug it in, install the drivers and it'll be as stable as if you went and downloaded their latest (obviously you'd have to in order to get support for the newer OSes, but the point is valid).
Speaking of ancient cards, it's worth noting that the nVidia Detonator drivers are the very same for everything from the TNT to the GeForce 4, despite the very radical differences between them. No, the ones out of the box may not have been so great -- but for likely all eternity you'll be able to download new, great ones, with lots of new features, which brings me to...
4: THERE IS ABSOLUTELY NOTHING DRIVERS CAN DO TO ADD FUNCTIONALITY THAT DOESN'T ALREADY EXIST IN THE HARDWARE ITSELF!!!
Man, how could you have a "great big stack" of video cards and honestly say something so contrary to reality? There's a whole lot of software in between your game and your graphics card. Obviously, by upgrading this software, you can get improvements in frame rate, quality, and yes, features. Like the time I downloaded the new nVidia driver and, suddenly, I had anti-aliasing. Like every time I download a new DirectX SDK and look at all the demos -- they do stuff previous versions could not do. Software and hardware have to work together to give you a good user experience.
Re:Matrox cards always look good on paper (Score:2)
Re:Matrox cards always look good on paper (Score:2)
Like the abysmal OpenGL drivers in the G200 era?
Re:Matrox cards always look good on paper (Score:2)
Re:Matrox cards always look good on paper (Score:2)
This new card... (Score:3, Interesting)
Matrox's new card has one feature that no other card can match yet, and that's the three monitor support. There is no other single card that has the low price and three monitor support. And no AGP/PCI solution will let you play one game on three monitors.
I don't think that I'll be buying one, but that's just because I don't think the video card should cost 50% of your system's total value.
Re:This new card... (Score:2)
I completely agree that 3 monitors will be awesome, but the problem is that each monitor is capped at 1280x1024. And if you want to play a game, then each monitor is capped at 800x600. How many people would love 3 monitors at that rez?
Re:NVIDIA doesn't have Tv-out? (Score:2)
Tradeoff (Score:2)
Optimizing for (subjective quality) * (framerate) may put the optimum at a lower resolution than it would be on cards without AA.
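A toy illustration of that tradeoff - every number below is invented just to show the shape of the argument, and the "quality" scores are obviously subjective:

```python
# Made-up (setting, FPS, subjective quality on a 0-10 scale) figures for a card
# whose AA is cheap. None of this comes from a real benchmark; it only shows how
# quality * framerate can peak at a lower resolution once AA enters the picture.
settings = [
    ("1600x1200, no AA", 60, 6.0),
    ("1280x1024, 4x AA", 70, 7.0),
    ("1024x768, 16x AA", 80, 7.5),
]

for name, fps, quality in settings:
    print(f"{name:18}  quality*fps = {fps * quality:.0f}")

best = max(settings, key=lambda s: s[1] * s[2])
print("best tradeoff:", best[0])
```

On a card where AA costs little, the product peaks at the lower, AA'd resolution; on a card where AA halves the framerate, it won't.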
Take a trip down memory lane, and hit the brakes! (Score:2, Insightful)
Why does this so vividly remind me of 3Dfx vs NVidia a little over two years ago? 3Dfx had their uber-AA system, but it would drag Quake 3 to about 8 frames per second while the butt-ugly TNT2 just cruised along at a clean 40 fps (which was remarkable back then). 3Dfx collapsed months later when they learned one of the golden rules of computing: quantity over quality.
Granted, Matrox's prime market isn't the gaming sector; they've truly carved their throne in the business sector, filling the gap left behind by Number Nine. But now they're trying to market to gamers with this feature-packed chipset, and I fear they're going to fall flat on their ass just like last time. If we've learned anything from NVidia, it's that people are willing to buy gobs of GPU power at insane prices. Your GeForce2 is too slow to play UT2 at 1600x1200? Then get the GeForce4, with two GPUs this time for more power.
If Matrox wants a share of my gaming budget, they'll have to start putting more raw goodness into their boards. Heck, just figure out how to link two or four Matrox GPUs and make them spew pixels like there's no tomorrow.
clock speeds (Score:1, Insightful)
Going by what I have seen so far, the P-512 does well on possible "future" applications (aka games) but poorly, compared to cards with higher clock speeds, on "current" applications. Think about how the GF3 was slower than a GF2 Ultra on DX7 games due to the GF3 being 200MHz while the GF2U was 250MHz. Once DX8 applications came along, the GF3 spanked the GF2U thanks to the features unique to the GF3 (at that time).
Maximum PC has an extensive review (Score:2)
Unfair tests (Score:3, Interesting)
Many were done at 640x480, which is not the card's strength.
Give the card games with huge textures and run it at 1280x1024 and above and you will see how it outshines even nVidia's best offer.
However, there are not many games (any at all?) which can really stress this card at that level. So, for now, Parhelia buyers will have to accept that the nVidia GeForce and ATI Radeon cards are better suited to today's games.
As usual, it is a chicken-and-egg question. Which comes first? The game or the card.
The Parhelia simply arrives a tad early to market.
It'll get better (Score:3, Insightful)
How much better it'll get is a valid speculative point. Did they hire any of the old 3Dfx crew?
Tom's hardware review - Bogus benchmark... (Score:1)
I don't understand why Tom's decided to include the "Sharkmark" benchmark in their review.
Re:Tom's hardware review - Bogus benchmark... (Score:1)
Just look at the benchmarks at HotHardware [hothardware.com]; you will see that in NVIDIA's own benchmarks the Parhelia is approx. 6% behind the GF4. And remember that the P's drivers are far from mature compared to NVIDIA's...
Re:Tom's hardware review - Bogus benchmark... (Score:2)
Neverwinter Nights benchmarks? (Score:4, Insightful)
I seriously doubt that people are buying these cards to play Quake or Serious Sam, so why is it always these games that get benchmarked?
Because various 3D engines use different technologies, and these are in turn supported differently by the card manufacturers, it's not possible to simply extrapolate from Quake results to NWN.
Re:Neverwinter Nights benchmarks? (Score:2)
Re:Neverwinter Nights benchmarks? (Score:2)
Linux drivers? (Score:1)
Re:Linux drivers? (Score:1)
Re:Linux drivers? (Score:2)
matrox multi-heading and linux (Score:1)
Matrox is the 2D king, not 3D (Score:1)
$400 too much? (Score:1)
What about triple head output?
What about 10-bit per channel RAMDACs?
What about 5th order output filter?
What about jitter-free sync signals?
What about an 80-million-transistor GPU?
Clearly the biggest transistor count among cards of this category. Hello? The transistors are there for something (and they cost)! They are not there for yesterday's games but for future ones. They do edge-based antialiasing, which means beautiful graphics and no more blurred fonts or 2D images. Hardware displacement mapping surely also needs a lot of transistors.
See the Matrox SharkMark benchmark to see what the card is capable of, once the games start to use new technologies.
ATI:91fps, NV:111fps, Matrox:166fps.
Also see benchmarks comparing best-image-quality performance (AA on).
Q3 1600x1200x32: NV: 37fps, Matrox: 41fps (source for all benchmarks: tomshardware.com).
OK, ready for the flames
The Tech Report review (Score:2)
Everything's a bit thin on the ground right now, as most sites only received their hardware late last week.
Carmack's .plan (Score:2)
You can also finger his email account if you are so inclined. (But you spammers will have to figure out the address for yourselves.)
To summarize:
not as fast as a GF4 or Radeon
AA is nice, but not fast enough
10 bit color is nice, but not nice enough
drivers suck...at the moment
Re:What is it? (Score:1, Insightful)
I'm just bored, look on google. I might be right tho.
It is... (Score:1)
Ya, I'm bored too
Re:It is... (Score:1)
ha-ha-ha (Score:1)
While all you carpet rats are /.ing hothardware.com, I'm enjoying my 2nd seamless page of the Parhelia review brought to you by Tom's Hardware. Oh sh1t!! I take that back! Tom's doesn't have a review! Don't visit tomshardware.com! Ahh, damn you monkeys; there went my progress on page 3 of 20, down the /.ing drain. (*cries*)