Graphics Software

Matrox Parhelia 512 Preview

SpinnerBait writes "Finally, you don't have to sift through all the unreleased and unauthorized bogus information around the net about Matrox's upcoming 3D graphics chip, called the Parhelia 512. Matrox has taken the wraps off its next-generation GPU, and this preview over at HotHardware goes through its feature set with a fine-toothed comb. They also give you a very rare glimpse inside Matrox's Montreal headquarters, as well as a look at some very impressive technology demos rendered on the new chip. Looks like impressive stuff for sure."
This discussion has been archived. No new comments can be posted.


  • Too bad Matrox cards don't play with Macs - nVidia GeForce 4 Ti for me then.

    PS Frist P0st
  • Wow... (Score:2, Interesting)

    by NorthDude ( 560769 )
    This graphics card has almost more processing power than my two PCs combined! The only thing I wish is that Matrox could come back a bit in this market. They made some good cards in the past. More choices can only be good. And do they have a good record of supporting Linux? Funny, they are located in my town and I know less about them than about all those US-based companies :)
    • This graphics card has almost more processing power than my two PCs combined!

      sad sad sad.

      oh wait, the same statement probably applies to my three computers. OUCH.
  • by Anonymous Coward
    Kyle over at [H]ard|OCP has quite a load of info on the card:

    http://www.hardocp.com/articles/parhelia/index.html

    http://www.hardocp.com/articles/parhelia/analysis.html
  • Is this crap? (Score:1, Insightful)

    by dingo ( 91227 )
    In the article they say:

    "Gigacolor", as Matrox likes to call it, is otherwise known as full display of over 1 Billion colors. Before you peg the "Marketing-Hype-O-Meter" too far, believe it or not, the human eye can definitely tell the difference between 16 Million and 1 Billion colors

    Now, if I remember correctly, there are fewer than a dozen monitors that can produce this kind of detail (please correct me if I am wrong), and no one can realistically tell the difference (once again... please correct me if I'm wrong). Anyhow, I can see something more than 5.2 on the marketing hype-o-meter.

    • Well, from reading a few articles about needing more than 8 bits per channel, it's all down to blending precision.
      It's a bit like mixing with 24-bit sound recordings and then downsampling them to 16-bit.
    • Re:Is this crap? (Score:2, Insightful)

      by hbackert ( 45117 )

      24 bits (16 million colors) is a lot, and I certainly have difficulty telling the difference between color #70e0e0 and #70e1e0. But when you want a nice background gradient, plain blue (#0000ff) at the top and black (#000000) at the bottom, there are only 254 levels in between, and I can clearly see the lines where the blue value changes.

      And that's where more colors shine: just using 10 bits instead of 8 reduces those color bands by a factor of 4.

      Instead of leaving those alpha bits unused (in 32-bit color mode), one might as well use them for nicer colors. Now, which OS supports that mode? X11?
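
      For the curious, here is a minimal sketch (plain Python; the helper name and the exact bit layout are my own assumptions, not Matrox's documented format) of how a 10:10:10:2 pixel could be packed, and why two extra bits per channel cut the banding by a factor of four:

          def pack_rgb10a2(r, g, b, a):
              """Pack 10-bit R, G, B and a 2-bit alpha into one 32-bit word."""
              assert 0 <= r < 1024 and 0 <= g < 1024 and 0 <= b < 1024 and 0 <= a < 4
              return (a << 30) | (r << 20) | (g << 10) | b

          # A black-to-blue ramp has 256 steps at 8 bits per channel,
          # but 1024 steps at 10 bits -- the "factor of 4" mentioned above.
          print(2 ** 10 // 2 ** 8)   # -> 4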

      Harald
      • [...] but when you want a nice background gradient, plain blue (#0000ff) at the top and black (#000000) at the bottom, there are only 254 levels in between, and I can clearly see the lines where the blue value changes.

        Actually, you should make that black to blue to white. And while you'll manage to distinguish the colours at the centre of the scale (near "pure" blue), I doubt you'll be able to distinguish the colours at the top and bottom (near white and black).

        The limitations of 24-bit colour can also be dealt with by dithering. Most high-end animation programs render internally at 48 / 64 bits per pixel (16 bits per component) and then dither the image when they convert it to 24 bpp (8 bpc). This results in a much smoother transition from black to blue (and then to white), with no visible banding.

        Most modern graphics cards already do real-time dithering, but only in 16-bit modes (which still work internally at 24 / 32).
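
        A rough illustration of the dithering idea (plain Python, ordinary random dithering; real renderers use fancier ordered or error-diffusion dithers, and the constants here are purely illustrative): quantise a high-precision black-to-blue ramp down to 8 bits, with and without dithering.

            import random

            def quantise(value16, dither=False):
                """Reduce a 16-bit channel value (0..65535) to 8 bits (0..255)."""
                noise = random.random() if dither else 0.5   # 0.5 = plain rounding
                return min(255, int(value16 / 257 + noise))

            ramp = [int(65535 * i / 999) for i in range(1000)]       # smooth 16-bit ramp
            hard = [quantise(v) for v in ramp]                       # stair-stepped: visible bands
            soft = [quantise(v, dither=True) for v in ramp]          # steps broken up by noise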

        RMN
        ~~~
      • I read somewhere (can't recall where now) that some programs (e.g. MSWord) crash in 10/10/10/2 mode. Does this imply that Windows will support this as a native desktop mode? That would be nice - our software would benefit from better colour representation.

        Think I'll talk to my contact at Matrox to see if we can get ahold of one of these and support this mode.

    • It is. (Score:4, Insightful)

      by Rui del-Negro ( 531098 ) on Tuesday May 14, 2002 @10:14AM (#3516932) Homepage
      The human eye can distinguish about 10 million different colours. But it's more sensitive to some frequencies than others, so sometimes 24 bits (16 million colours) may not be enough.

      For example, most people can distinguish between two very similar 24-bit medium greens but not between three or four similar 24-bit dark blues.

      That said, no monitor can accurately represent 16 million colours, let alone several billions. Even if they could, the dynamic range of monitors is very limited compared to the range our eyes can see (ie, monitors have very limited brightness compared to the normal sunlit world), so most of those colours would be wasted.

      Higher colour precision is good because it minimises round-off errors, but this applies mainly to internal calculations (some operations are done directly on the final framebuffer, but very few). For display, 24 bits (and a good monitor) are more than enough.

      RMN
      ~~~
      • Re:It is. (Score:5, Funny)

        by dingo ( 91227 ) <{gedwards} {at} {westnet.com.au}> on Tuesday May 14, 2002 @10:24AM (#3517006) Journal
        For example, most people can distinguish between two very similar 24-bit medium greens but not between three or four similar 24-bit dark blues.

        If I remember my biology, this is because there are lots of green (natural) foods but not so many blue ones, and we have therefore allocated more cones in our eyes to distinguishing greens than blues.
        This is why blue M&M's are an affront to nature :)

        • We can see wavelengths ranging from red to blue (well, it's more from magenta to magenta, actually, but you know what I mean). Green happens to be in the middle of that range (approx. 535 nm), so naturally we have better colour accuracy at those wavelengths.

          For all I know, some food may have lovely ultra-violet or infra-red shades mixed with their yellow or green, but we just can't see them (some animals can). In fact, if we could "see" much longer wavelengths, we'd have heat vision. Cool but probably a bit confusing.

          It's also interesting that, while our eyes have receptors that are sensitive to YGBL (Yellow, Green, Blue and Luminance), we tend to think in HLS (hue, luminance and saturation).

          Our brain "constructs" the red parts of the image from the other signals. If our eyes see something that has high luminance, some yellow, very little green and no blue, we perceive it as red (note that here when I say "yellow" I mean "something that is detected by our 'yellow' receivers", not pure yellow).

          This is actually similar to the way TV signals are transmitted (a black-and-white signal plus two "difference" colours signals, so it's compatible with both B&W and colour TVs).

          And, of course, not everyone's eyes are calibrated the same, so what is brownish green to one person can be greenish brown to another, and so on.

          RMN
          ~~~

      • "...monitors have very limited brightness compared to the normal sunlit world..."

        Sunlit world? What's that?
    • The fact that the human eye can only see about 16 million colours is true. The displays that we use can only display about as many, too. BUT, when you render stuff at 10-bit-per-channel precision you get far fewer rounding errors.
      Think about it this way: you apply 4 texture maps to the same object, and then have a couple of light sources... some fog. If everything gets rendered at 32-bit colour (8 bits per channel) you will get more rounding errors.

      The Matrox Gigacolor will reduce banding and visual artifacts in rendered scenes...

      I would assume that the end result will still be a 32bit image sent to the display.
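
      A quick toy example (my own, in plain Python; it is not Matrox's pipeline, just a fixed-point blend that quantises after every step) of how the rounding error piles up at 8 bits versus 10 bits per channel:

          def blend_chain(layers, bits):
              """Blend (value, weight) layers, quantising to `bits` after each blend."""
              scale = (1 << bits) - 1
              value = 0.0
              for layer, weight in layers:
                  value = round((value * (1 - weight) + layer * weight) * scale) / scale
              return value

          layers = [(0.63, 0.5), (0.21, 0.4), (0.87, 0.3), (0.42, 0.6)]   # 4 "textures"

          exact = 0.0
          for layer, weight in layers:                                    # full-precision reference
              exact = exact * (1 - weight) + layer * weight

          print(abs(blend_chain(layers, 8) - exact))    # accumulated error at 8 bits per channel
          print(abs(blend_chain(layers, 10) - exact))   # several times smaller at 10 bits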
    • The main effect of offering 10 bits per channel colour will be to reduce banding. For example, your current card can only display 256 shades of pure red: 0xff000000 to 0x00000000. This produces significant banding.

      On still images the difference between 8- and 10-bit colour is not that significant; the human eye does a decent job of interpolating the bands. Where 10-bit really shines is in moving pictures, either in games or movies. When the bands move across the screen because the camera is panning past a star, the banding is really evident in 8-bit.
    • Keep in mind that there is a difference in the frequency response of your monitor for luma and chroma!

      We are much more attuned (visually, of course) to differences in intensity (luminance) than to differences in color. It is possible for monitors to have a higher luma response than chroma response, so...
      Having 30 bits worth of intensity variation (OK, call it 10 bits if you will) can provide more distinct rendering, even on monitors that don't support 1 billion "colors".
  • Don't forget... (Score:4, Insightful)

    by supercytro ( 527265 ) on Tuesday May 14, 2002 @09:45AM (#3516790)
    ...to release decent drivers. Tested and stable would be nice...
  • Hmmm (Score:2, Interesting)

    Good job my PC blew up the other day; I now have a great excuse to upgrade...

    Or more seriously, I wanted to upgrade before (from my G400), but GeForce / ATI cards have poor 2D performance and some bad filters which require a bit of hacking to sort out, and Matrox didn't have a viable home/gamer solution. Sure, their 10-bit medical cards look nice, but they're not quite for me.

    The only problems I have had in the past with Matrox cards are:
    Poor OpenGL support, though the drivers seem to have been fixed as of Feb this year.
    Their Linux support is a little, well, patchy. They do provide drivers, but they're only half open and a bit of a pain to get working correctly; some of the problems may have been down to old X4 versions, though.

    Well, I'll buy one in the next couple of months and try to post a more informed comment!
    • Re:Hmmm (Score:2, Insightful)

      by statichead ( 66370 )
      I have had nothing but good luck with Matrox and Linux. I was running OpenGL with GLX at first, then moved to DRI, which has now been integrated into the kernel; it has been getting easier and easier. Also, Matrox provides a support forum specifically for Linux which has helped me with more than one issue. You may scoff at the binary-only drivers that Matrox releases, but they are easy to install and provide some nice tools to make configuration easier. What other video card manufacturer supports Linux to this extent?

      Return to Castle Wolfenstein on a two-plus-year-old card (G400 MAX) with reasonable framerates; I'm OK with that and am looking forward to a new Matrox.
  • This stuff sounds very much like the stuff Sutherland was playing with: very fine-grained asynchronous pipelines with very high throughput.
    Evans & Sutherland were the people who made the military simulators a long, long time ago.
  • Here are some more links with Parhelia info:

    http://www.hardocp.com/articles/parhelia/index.html

    http://www.hardwarezone.com/articles/articles.hwz?cid=3&aid=425

    http://www.matrox.com/mga/products/parhelia512/home.cfm

    I hope they make Linux drivers for it. Hardware text AA seems kinda cool.

    • Matrox has been very good at providing drivers for Linux. Some may say that the drivers are a bit spotty, but I have had no problems with them or the configuration tool that they have designed for Linux. If you look at the bottom of the page of the article it says, "Operating Systems: Windows, Linux" so I think this is going to be happening.
  • by cOdEgUru ( 181536 ) on Tuesday May 14, 2002 @09:53AM (#3516820) Homepage Journal
    Here [anandtech.com]

    The summary mentions quite a few interesting notes regarding the effect this card would have on current games.

    - In "simple" games like Quake III Arena, the Parhelia-512 will definitely lose out to the GeForce4 Ti 4600. By simple we mean games that generally use no more than two textures and are currently bound by fill rate. NVIDIA's drivers are highly optimized (much more so than Matrox's) and in situations where the majority of the Parhelia's execution power is going unused, it will lose out to the Ti 4600. This can change by turning on anisotropic filtering and antialiasing however, where the balance will begin to tilt in favor of the Parhelia.

    - In stressful DX8 games, Matrox expects the Parhelia-512 to take the gold - either performing on par or outperforming the GeForce4 Ti 4600. Once again, as soon as you enable better texture filtering algorithms and antialiasing the Parhelia-512 should begin to seriously separate itself from the Ti 4600. The quad-texturing capabilities of the core as well as the 5-stage pixel shaders will be very handy in games coming out over the next several months.

    So from the look of it, Parhelia does not wipe out Nvidia (though I would like them to), but is a worthy competitor to nvidia in current games. It would be interesting to see how ATI and Nvidia match up to this new competitor in the coming months.

    Be afraid. Be vewy vewy afraid.
    • In "simple" games like Q3A, we're already seeing frame rates >100 even in high resolutions which is more than the monitor can handle. In these cases, even a 50% difference isn't a big deal. Also, ISTR that there's limit to what the human eye can see; any frame rates over that are wasted.
      • You're ignoring the reality that framerates in games from this century are much lower than Q3A's. With UT2003 and Doom 3 coming out this year, I'd be surprised if an average of 100 fps at 1024x768x32 is going to be achievable on currently available hardware. Hell, even currently available games like MOH:AA and Dungeon Siege don't manage that.
      • by Namarrgon ( 105036 ) on Tuesday May 14, 2002 @11:37AM (#3517515) Homepage
        That tired old line again? As with everyone else who trots this one out, you're ignoring a number of things:

        - First off, Q3A is used as THE single standard metric to see how a card will perform under a common load. It's a very good way to judge the raw speed of a card overall, and often provides good pointers as to overall performance in fancier modes or other games, but it certainly doesn't mean every game you play will be 100+ fps.

        - Second, that figure is an AVERAGE. When actually gaming, the average framerate is not the issue - the MINIMUM framerate is the killer. 60 fps average is fine, but when the framerate drops to 10-15 fps in a heavy firefight, you're in trouble. A higher average framerate usually translates to a higher minimum as well. In fact, many sites have taken to quoting minimums as well, or even showing a complete framerate graph.

        - Third, the ability to manage 100 fps at e.g. 1024x768 means only around 40 fps at 1600x1200, if your monitor extends that far, or perhaps only 30 fps at 1024x768 with 4x AA if it doesn't. Your card will need to score 200 fps if you want to improve your resolution/AA, or maybe even 300 fps if you want to do that and still keep your minimum fps above 60 (a rough fill-rate calculation is sketched below this list).

        - Fourth, the same argument applies to other quality improvements like trilinear and anisotropic filtering. Taking 32 texture samples instead of 4 can really kill your framerate, so you'd better hope you're getting enormous framerates with non-anisotropic filtering if you hope to get acceptable speed with anisotropic filtering enabled.

        - Fifth, Q3A is not the only game out there. There are a lot of more demanding games available today, even those based on the Q3A engine like RtCW, that will give you much lower framerates.

        Combining two or more of the above factors can bring the fastest graphics card to its knees, even if it scores 200 fps in Q3A. We'll have to wait until we see scores of 300 or 400 before we can expect to play Jedi Knight II at 1600x1200 with 9x AA and 16-sample anisotropic filtering, while never dropping below at least 30 fps. But boy, will it look good when we can :-)

        Ideally, a review will give individual scores for all the above - high resolution, AA, anisotropic filtering, a range of modern games, and all combinations of the above. But since this would entail a vast amount of testing and a huge array of numbers, most reviews settle for a few known tests that are indicative of performance in other tests. And the most popular of those is good old Q3A.
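
        The fill-rate numbers in the third point work out roughly like this (a back-of-the-envelope Python sketch that assumes a purely fill-rate-limited game, which is an oversimplification; the function is mine, not from any benchmark suite):

            def scaled_fps(fps, old_res, new_res, aa_samples=1):
                """Estimate fps after changing resolution and/or multisample count."""
                old_pixels = old_res[0] * old_res[1]
                new_pixels = new_res[0] * new_res[1] * aa_samples
                return fps * old_pixels / new_pixels

            print(scaled_fps(100, (1024, 768), (1600, 1200)))               # ~41 fps
            print(scaled_fps(100, (1024, 768), (1024, 768), aa_samples=4))  # ~25 fps (real 4x AA costs a bit less)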

        • Despite Q3A being a good benchmark, it still isn't what people use their computer for all the time. It's best to look at a range of benchmarks for a card - or mainly at the ones for the things you'll be doing most of the time - to get a more rounded picture of how cards perform relative to each other.
    • "In "simple" games like Quake III Arena, the Parhelia-512 will definitely lose out to the GeForce4 Ti 4600. In stressful DX8 games, Matrox expects the Parhelia-512 to take the gold - either performing on par or outperforming the GeForce4 Ti 4600."

      I am having a hard time getting excited about a video card that is not out yet and will be as good as a video card that has been out for months.
    • You know, all this speculation is driving me nuts! It's useless to speculate whether the Parhelia will kick nVidia's ass or not until we've seen some actual benchmarks.

      In theory you/they are correct, but what about the Parhelia in practical tests? That's what I'm waiting for.

      Think about it before you mod me down as troll/flamebait/offtopic.
  • by MacBoy ( 30701 ) on Tuesday May 14, 2002 @09:54AM (#3516828)
    I wonder if their new Parhelia can deliver on its promises. Have Matrox's OpenGL drivers improved significantly over the past few years? Poor OpenGL was what killed the G200's promising future, and I would hate to see a repeat performance.
    • The G200 delivered beautifully on everything it promised. It allowed me to run 4 monitors. It wasn't a gamer's chip; it was intended to help show more info than previously possible. Using the PCI version in Win2K with a special patch, I saw one PC that had 16 monitors attached. Amazing.
  • It's really nice to see all sorts of nice specs in a preview, but it doesn't really tell us much of anything about actual performance - especially gaming performance. People trust nVidia previews because nVidia has had a good run of high performing gaming video cards. Matrox can add a flux capacitor to their cards and still have a worthless gaming card. Specs and previews mean nothing. Final hardware and drivers are everything.
  • The 16 sample AA shown here looks nice and there's a bit of detail on a few of the features like the hardware displacement mapping. Very nice looking.

    http://www.hardwarezone.com/articles/articles.hwz?cid=3&aid=425&page=1
  • Well, it looks like Matrox may be back into the mainstream. To most consumers, they're an unknown. To techies, they're the little company that refused to die, and to businesses, they're the best supplier. We'll see which of those three items changes.
  • It looks like there is another dog fighting over the same bone now. At last, some real competition in the market.

    This is what is really needed in the industry. nVidia and ATI have been the top dogs for a while, and the new releases have been a little stale. Sure, the GeForce 4s have been nice, but there are those out there who think that the GeForce 3s give better image quality. Then there's ATI and its new Radeon 8500 128MB card... it's just an 8500 with 2x the memory.

    Matrox entering the ring again with this new chip and its abilities should rattle the windows for a bit, and we'll see nVidia and ATI scrambling for next-gen cards to outperform Matrox.

    It's a competitive situation that promotes quality product for everyone.

    Now if only M$ would get the clue eh?
  • I hope it's better than their sorry excuse for a TV card that never worked right. It was a few years old, but it was an add-on board to one of their video cards (VfW), and the framerate changed wildly, if it worked at all.

  • by Vortran ( 253538 ) <aol_is_satan@hotmail.com> on Tuesday May 14, 2002 @10:07AM (#3516890) Homepage
    Ok.. so it has AGP 8x. Nifty! What motherboard do I buy that has AGP 8x? I just bought an Abit KR7A-RAID with Via KT266 chipset, thinking this is a pretty decent board, but I doubt it supports AGP 8x.

    Now we move on to monitors. Could someone recommend a monitor that I can use to accurately resolve 1 billion colors? I tend to run my 2 Viewsonic PT775's at 1600 x 1200 so I've grown accustomed to that much "real estate".

    This sounds like an awesome card, but I really don't know where to go or what to get to reap all the benefits of it.

    Lastly, precisely when and where can a fellow technogeek acquire one? Since the HotHardware site seems to be experiencing some serious "Slashdot Effect" I was unable to finish reading the entire article. MRP $$ and a release date would be very useful.

    Vortran out
    • MRP: Approximately 450 US dollars.
      And this would be for the fastest Matrox card. There might be more than one flavour, but don't expect them to diversify it like Nvidia does (Ti200, Ti4400, Ti4600 and the MX420 and MX440).

      Shipping Date : Late June
    • The AGP modes are usually backwards compatible, so if your board supports AGP 4x then it'll be fine.


    • If you build it, they will come. When Voodoo introduced 32-bit maps, games were designed to support it. I'm sure that with such a powerful card, especially with regard to mapping, games will jump to look more realistic, which seems to be the goal. My concern is still the price: I don't feel like paying $300 and up for a card just to play video games; I'd rather just get a PS2. I still think that computers are not suitable gaming platforms. It's much more fun playing in my living room on my 40" TV with my friends than in my study on a 19" monitor.
    • RE: 8x AGP:

      I've been doing some pricing on a new machine for myself (which I cannot afford yet), and some of the newer KT333 chipsets come with AGP 8x, as well as USB 2.0, etc.

      It's out there, but do we use it? Not yet...
    • AGP 8x: It'll work in an AGP 4x motherboard. Motherboards with 8X AGP should be out soon.

      Monitors: Analog monitors (e.g. the one you have today) can display an effectively infinite number of colors. The DAC (digital-to-analog converter) on the graphics card creates the appropriate analog signal. The real question is whether digital DVI monitors will support more than 24 bits of color.

      Where: Matrox has a list [matrox.com], including their own online store [shopmatrox.com]. CDW [cdw.com] seems to carry most Matrox products.

      When: June.

      Price: $450 for the top-end, low-end was not specified anywhere I could see.

  • Seems to me that the hardware they use for running their website is not so hot. They're fried already.
  • Anyone seen any pictures of the card and the possible connections? I would like to see a picture of the bundle and the various connectors included.
  • Worried execs decided to announce the launch of the GeForce 5 [reserve.co.uk] later this year.

    I kid you not!!!
    • ExtremeTech checked out 3Dlabs' offering instead:
      http://www.extremetech.com/article/0,3396,s=1017&a=26271,00.asp
      which in my eyes sounded a lot better than Matrox's offering, since it was much more general-purpose. But on the other hand, Matrox knows what features are really needed, and the PS2 showed that general-purpose features won't get you anywhere if they are hard to use. Feature-wise it's a draw, but they are two different kinds of beast.

      ExtremeTech also has a thorough discussion of the Matrox release:
      http://www.extremetech.com/article/0,3396,s=1017&a=26865,00.asp

      And don't blame me if that site doesn't have persistent links.
  • Looks like a good card, but there's still no mention of non-Windows, non-x86 or non-DirectX support.
    In some of the earlier 'previews' there was talk of OpenGL 2.0, which I'm sure this card will theoretically be compliant with (once the ARB settles on the specs, of course). But what of support for Linux, BSD, OS X? Does the hardware support both big and little endian?
    It's fair that Matrox is pushing the DirectX 8.1 (and 9, no doubt) and Windows thing now, but when will we hear about other possibilities?
    • Matrox has provided some really good support in the form of driver software and forums for various flavors of Windows and Linux. I believe there is also some level of FreeBSD support. I would be surprised if this one didn't have non-Windows support. I can't speak for non-x86 support.
  • Tom's Hardware also has a review of this card; however, it's not actual silicon -- they just review the spec sheets that Matrox has given them.

    www.tomshardware.com/graphic/02q2/020514/index.html

    Beware -- I was just trying to get to the 3rd page in the review -- it appears to be getting slower..... ?
  • Specs VS G450 (Score:3, Interesting)

    by LoudMusic ( 199347 ) on Tuesday May 14, 2002 @10:17AM (#3516961)
    I'm currently running an AGP Matrox G450 with 32MB of RAM and two CRTs. I like the card because it allows me to go up to 3200 x 1200 resolution with 32-bit color.

    I really like the prospect of having three monitors to alleviate the issue of the giant gap between displays caused by the thick border of any display. However...

    This new card claims it only does 3840 x 1024 resolution across three monitors. It still has the max color depth, but the resolution has to drop. By going to this big fancy new card I'd only gain about 100,000 pixels, which in reality is next to nothing.

    Is it a driver limitation, or does it take more than a 512-bit, dual-400MHz, 256MB video card to push 4800 x 1200 for simple 2D functions?

    ~LoudMusic
    • The problem is that, while the two internal DACs are 400 MHz, and each is capable of 2048 x 1536 x 32 @ 85 Hz, the third, external DAC is (in standard Matrox style) only 230 MHz.

      Which means, if you want to run all monitors at the same res (required for "Surround Gaming", really), you're limited to the resolution of the external DAC, which probably struggles to do 1280 x 1024.

      It's nothing to do with the driver, and you can always add a second PCI gfx card for more monitors to get all the area you need. Try 5 x nVidia Quadro4 400NVS cards, each with 4 monitor outputs capable of 2048 x 1536, for a total of 61 million pixels - 16 times what you have now :-)
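
      For a rough sense of the numbers, here is a back-of-the-envelope pixel-clock estimate (plain Python; the ~30% blanking overhead is my own rule of thumb for CRT timings, not an exact GTF calculation):

          def pixel_clock_mhz(width, height, refresh_hz, blanking=1.3):
              """Approximate DAC pixel clock needed for a given mode."""
              return width * height * refresh_hz * blanking / 1e6

          print(pixel_clock_mhz(2048, 1536, 85))   # ~348 MHz: fine for a 400 MHz DAC, beyond a 230 MHz one
          print(pixel_clock_mhz(1600, 1200, 75))   # ~187 MHz: roughly where a 230 MHz DAC tops out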

      • Ah, actually an informative answer. Thanks (:

        I could throw in a couple of additional cards now and spread out the desktop - the reason I don't is that I like to have Windows think it's all one display. It treats windows and the desktop differently, and the task tray spans the entire desktop field. I've got a PCI GeForce2 MX that I could throw in to add another 2 monitors, but it can only handle 1280 x 1024... not bad, but I'd rather crank up the res while my eyes are still good (:

        ~LoudMusic
        • Well, most multi-monitor gfx vendors (including Matrox and nVidia) provide software to better manage your windows, dialogs etc - preventing them from popping up on the screen split, etc. I don't know if nVidia's nView (added to recent drivers) has all the features of Matrox's DualHead, but it seems to me that more space is still better :-)

          Not sure which GF2MX you have, but the ones I've seen certainly supported up to 2048x1536 on the primary monitor at least. They have a 350 MHz DAC, IIRC. And different resolution screens should be possible too, at least under XP/Me.

          • It's a 64MB MX 400 dual VGA. It does get that resolution for one monitor, but when it's running two CRTs, it can only handle 2560 x 1024. If I had smaller monitors, 17", I wouldn't mind. But with 19" monitors, using a slightly lower resolution makes me feel like I'm cheating myself.

            ~LoudMusic
  • by cOdEgUru ( 181536 ) on Tuesday May 14, 2002 @10:17AM (#3516964) Homepage Journal
    This [aselabs.com] is the only picture I could find of Parhelia.

    Look at the massive heatsink on that baby... Ooooh mama...
  • by grmoc ( 57943 ) on Tuesday May 14, 2002 @10:17AM (#3516965)
    For those of you who don't already know, professional TV standards (specifically D1, also known as SDI, though SDI is technically different) use 10-bit YCrCb video.
    This means that any particular pixel may have up to 30 bits of color (even though the maximum difference between colors of pixels is less than that).

    Obviously, this is not something that is easily accomplished with standard 24-bit/32-bit rendering. If you convert the SDI into something that can be represented in the frame buffer of the video card, then you've lost precision. This is unacceptable for broadcast! (And no, overlay isn't generally good enough, since you want to capture the pixels for output through SDI.)

    Admittedly, this card isn't perfect - it would be nice to have 8 bits of destination alpha (for a key channel). Four levels of keying just aren't enough...

    In any case, having a card (finally!) support 10 bit rendering (especially the 10 bit rendering in openGL) in hardware will be wonderful!
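
    To make the precision argument concrete, here is a minimal sketch of BT.601-style RGB-to-YCbCr conversion at 10-bit video range (plain Python; the helper is mine, not an actual SDI or Matrox API):

        def rgb_to_ycbcr10(r, g, b):
            """r, g, b in [0.0, 1.0] -> 10-bit video-range Y', Cb, Cr codes."""
            y = 0.299 * r + 0.587 * g + 0.114 * b
            cb = (b - y) / 1.772
            cr = (r - y) / 1.402
            return (round(876 * y + 64),     # Y':  64..940
                    round(896 * cb + 512),   # Cb:  64..960
                    round(896 * cr + 512))   # Cr:  64..960

        print(rgb_to_ycbcr10(1.0, 1.0, 1.0))   # (940, 512, 512): reference white
        print(rgb_to_ycbcr10(0.0, 0.0, 1.0))   # pure blue: Cb hits its maximum code

    Squeezing those 10-bit codes through an 8-bit RGB framebuffer and back necessarily throws away the low bits, which is the precision loss described above.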

    • by Namarrgon ( 105036 ) on Tuesday May 14, 2002 @12:35PM (#3517899) Homepage
      The Parhelia's 10-bit framebuffer & DACs will be great for previewing deep-colour images, e.g. in film work and some broadcast. It's not the first (SGI does 12-bit, Sun's new card does 10-bit, and so does the newly announced P10 from 3Dlabs). But it's not a solution for professional video output.

      As someone else pointed out, 10 bits of RGB does not equate to 10 bits of YUV. The Parhelia will give great 10-bit RGB previews (completely independent of output quality), and will even output a 10-bit YUV video signal - but only via S-Video, where the two colour signals get encoded together anyway. You need 10-bit component output, or 10-bit SDI, neither of which can be done by the Parhelia. It's aimed more at the 10-bit DVD market than at professional output solutions.

      The two-bit alpha limitation is largely irrelevant. For display on a monitor, RGB is all you need. Processing of deep-colour images should be done with at least 16 bits per component (including alpha) in memory for best results, then dithered down to 10 bit RGB for display. Key channel output requires a second video connector, so it won't do that at all.

    • That Matrox has all sorts of nice pictures to show off their 10-bit technology? But when you view it with your own video card, the most you'll get is 8-bit color. So what's the point of all the pretty pictures? Talk about the marketing folks not getting the point!
  • I've been using Matrox products since the Millennium II (I'm on a G400 32MB now). Always rock-solid 2D performance and quality. Their 3D is usually a little different from everyone else's (e.g. environmental bump mapping), but solid. It's nice to see they're going to be ahead of the curve in releasing the next, next-generation video card. I think this will give them a jump start in sales from gamers who haven't used Matrox before. The users of current Matrox cards will also be a huge market, as their customers are extremely loyal to them. All this goodness, and they're still a private company (and Canadian, no less).
  • I was reading the Parhelia review at HardwareZone [hardwarezone.com] when the server chugged to a stop and I wondered, "I wonder if Slashdot just linked to the article?"

    Just when I thought that my workplace would never spring for a card with these features, up popped Page 6 [hardwarezone.com] (just ignore all those pictures of people playing games with the card) with Glyph Antialiasing for "business appeal!" Three monitors, here I come.

    • Info on the Glyph Anti-aliasing is here [matrox.com].

      Their edge-AA functionality would lend itself well to font rendering. It's debatable whether it'll help the speed or even quality of current Windows font rendering, but so long as you're not forced to use it, it can't hurt. The hardware gamma correction is good, and it does "de-gamma" the background before blending in the text (which should be done with linear data).
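
      A small sketch of the gamma-correct blend described above (gamma 2.2 assumed; the hardware's actual pipeline is Matrox's own, this is just the general idea in plain Python):

          def blend_text(background, glyph_coverage, text_colour, gamma=2.2):
              """Blend a text pixel over a background, doing the maths in linear light."""
              bg_lin = background ** gamma             # "de-gamma" the background
              txt_lin = text_colour ** gamma
              out_lin = bg_lin * (1 - glyph_coverage) + txt_lin * glyph_coverage
              return out_lin ** (1 / gamma)            # re-gamma for display

          # A 50%-covered edge pixel of black text on a white background:
          print(blend_text(1.0, 0.5, 0.0))   # ~0.73, lighter than the naive 0.5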

      My question is, does it correctly support hinting? It's not much use unless it does.

  • Over the years I bought the hype that Matrox was dishing out. This is just more of the same. All would be fine if they delivered on the **software**. It took them more than a year after Win2k was released to release a final driver for the G200-TV card. The beta driver sucked (*lots* of BSODs). Their forums were clogged with people complaining about the lack of Win2k support. Either their driver group is incompetent, or Matrox corporate had other priorities. Either way, the situation sucks.

    I think the problem is the graphics card biz is a low margin business, and the first thing they skimp on is software. *Sigh*

    Also, they're so cheap their site is completely /.'d right now.
    • by Anonymous Coward
      I got screwed by the same issue - the Matrox story was that Microsoft broke VfW in 2000, and there wasn't a DirectShow replacement for the chipset. After a year of bluescreening betas, they gave up. (I gave up because I added a second CPU, which would be a complete no-go even if they got it working.)

      Note that the capture cards weren't advertised as 2000-compatible, and the driver was sort of a 'best effort' by Matrox (an effort which failed). The cards worked fine in 98, which is what it said on the box.

      And then CPUs/ATA disks got fast enough that you could do software capture with better results without futzing with some hacky low-end MJPEG board like the Rainbow Runner (which itself works OK in software mode).

      The lesson I learned wasn't necessarily that Matrox SuX0rs D00d, but instead that (1) video capture on Windows sux0rs in general because of the lack of standardized APIs, and (2) if you are serious about vidcap, it's better to spec out a semi-pro system (like the Matrox RT card), get it working, and never upgrade your OS/hardware just because it would be cool to do so. If you aren't serious, get a cheap software capture card (WinTV or ATI or something) and prepare for driver hell.
  • Hopefully Matrox will discontinue the DualHead, TripleHead, etc., naming conventions before they get to the sixth generation (for the same reason that Intel didn't release a Sextium).
  • I've used Matrox cards before, and while I can't fault the 2D image quality, I've had trouble with drivers, especially OpenGL ones.
  • by edgrale ( 216858 ) on Tuesday May 14, 2002 @10:47AM (#3517178)
    Take a look at this explanation [clara.co.uk] of what a parhelia is =)

    Interesting stuff.
  • Parhelia 512? Awfully strange name for a new Matrix film.
  • Does anyone have a clue what the Linux drivers will be like? Open source? Will they support the "surround gaming" features?

    This card looks really sweet, and Linux could really use some competition for NVIDIA in the 3D card market. I hope the Linux drivers are up to par.

    If they're binary only, I hope they put as much effort into them as NVIDIA does.

  • Matrox has a history of abandoning large sections of their users. They left owners of the Motion JPEG hardware high and dry when they decided that it was too difficult to get the hardware working correctly, and that it was better to run it without the hardware acceleration. Those who had spent hundreds for hardware-accelerated video recording were left with a system that was comparable to ones available for $30 or $40.
  • The Tech Report has an in-depth preview: http://www.tech-report.com/etc/2002q2/parhelia/index.x?pg=1 [tech-report.com]
  • by Shivetya ( 243324 ) on Tuesday May 14, 2002 @11:53AM (#3517630) Homepage Journal
    Matrox doesn't actually have a good history of getting cards out in a decent time frame. Figure that by the time this card is actually available (anyone remember the G400? How many months did it take to get one after it supposedly became available?) it will be irrelevant.

    The next problem is that Matrox ruined their reputation in my eyes with the G200 by lying about OpenGL. Lying about how they were going to have it in November, then December, and so on... they kept this up until they announced the G400, and then suddenly the G200 was a no-go.

    Ever since the G400 series it seems Matrox has been coming up with feature-laden cards... trouble was, no one asked for the features they chose to offer. Now they've added even more features and a buttload of performance to boot. Yet, as before, the GF5 will be announced about the time this card is supposed to ship, and will most likely be in stores at the same time.

  • I wasn't able to find any info from the hothardware or matrox sites. Any rumors as to when this is coming out, and how much it's going to cost?
  • I demand Nerdity points for counting down the hours (minutes) last night until the NDA ran out.

    And sending all of my friends half-hourly updates. :)

    Unfortunately I did have to go to bed and was thus unable to be the first non-NDA'd person to read the previews.

    Product name too hard to spell, help!

    I will just keep on referring to it as the G1000, SOOO much easier, heh.
