Graphics Software

Matrox's New Three-Head Video Card 312

This Anonymous Coward was one of many readers to point to sites with information on Matrox's upcoming Parhelia-512 graphics card: "It appears that some foreign hardware sites have violated NDA and posted some very juicy details on Matrox's next-generation hardware. iXBT's review can be found here (1), and a MURC posting with some other pics from China can be found here (2). It looks like the real deal. Will Matrox wake up from their long slumber in the 3D gaming market, or will this card be another stopgap like the G550 was?" Update: 05/12 14:07 GMT by T: Alexander Medvedev of ixbt.com points to the English version now online as well, and notes: "Please note, we can't violate NDA because we did _not_ sign anything with Matrox Graphics, and never received any info from Matrox."
This discussion has been archived. No new comments can be posted.

  • Gossip (Score:2, Funny)

    by dirvish ( 574948 )
    "some very juicy details"??? Did the video card sleep with its cousin or something?
    • Re:Gossip (Score:3, Funny)

      by Toraz Chryx ( 467835 )
      No, it locked a GeForce4 Ti4600 in a box and fed it only the bare minimum to keep it alive.

      for sport.

      whilst rendering Jedi Knight II at 3840x1024 / 32bit at 100FPS.
  • Mmmmm.... (Score:3, Funny)

    by DanThe1Man ( 46872 ) on Sunday May 12, 2002 @04:56AM (#3505213)
    Surround video Pr0n...
    • Sorry, but please tell me exactly why this was off-topic. "Surround video pr0n" is perfectly on-topic - it's definitely a possibility with this new video card. Joe Average with his Best Buy PC with 1 AGP port and 1 PCI port would never have been able to watch surround video pr0n on 3 screens before. This post was as on-topic as someone pointing out that the latest version of OpenOffice can edit word-art created in Word XP (not that I'm saying it can).
  • Parhelia 512 (Score:4, Informative)

    by sxpert ( 139117 ) on Sunday May 12, 2002 @05:03AM (#3505220)
    It seems this thing will kick major (major) major ass...
    The thing works with 10-bit colour
    has 5 outputs, and 2 display controllers (dunno how they will have 3 monitors attached)
    There is a glyph antialiasing unit (ahem...)
    DVD/HDTV decoder (10 bits) and also a 10 bit video digital interface.

    more info at
    http://ixbt.com/video2/parhelia512/chip_diagr.jpg
  • Since it's in Russian (Score:3, Informative)

    by MrBandersnatch ( 544818 ) on Sunday May 12, 2002 @05:07AM (#3505233)
    The only things that I can make out are :-

    1) It has some VERY fancy graphics tricks up its sleeve.
    2) 16xFSAA
    3) 3 Head support for "surround gaming" which appears to be supported by the driver rather than requiring the developers to support it.
    4) Support for 256MB of DDR RAM.

    The rumours I've heard about this card are that it's also FAST enough to give nVidia some competition. Hopefully Tom will have some benchmarks soon :)

    • It doesn't have 16x Full Scene AntiAliasing, it has 16x FAA (Fragment AntiAliasing)

      From what I can gather, it only AA's the parts of the image that actually NEED it; the fillrate savings must be enormous :)
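
      A minimal back-of-the-envelope sketch (Python; not Matrox's actual algorithm) of why antialiasing only edge fragments is so much cheaper than 16x full-scene supersampling. The 5% edge-pixel fraction is a made-up assumption for illustration:

      # Rough cost comparison: full-scene 16x AA vs. edge-only ("fragment") AA.
      # The edge-pixel fraction is an illustrative assumption.
      width, height = 1024, 768
      samples = 16
      edge_fraction = 0.05

      full_scene = width * height * samples
      fragment_aa = width * height * ((1 - edge_fraction) + edge_fraction * samples)

      print(full_scene / fragment_aa)  # roughly 9x fewer samples to shade
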
  • Also here (Score:2, Informative)

    by The_Shadows ( 255371 )
    Link [techimo.com]

    If the details are accurate, this could beat out the GeForce4 and bring Matrox back into the gaming market. Unlike the 550 [neoseeker.com]. Decent card, but 0 gaming potential.

    The_Shadows, out.
    • I think they must have realized that for a video card to be useful now, powerful 3D is the foundation. More and more applications are using 3D APIs; between Longhorn, 3D animation, modeling, design, and architecture programs, not to mention games, a video card is just no good unless it has some serious 3D muscle. Bring programmability in the 3D chips into play and the trend continues. Unless someone is doing straight Photoshop or video editing, I don't know who would use a triple-head card without 3D in it.

      I think Matrox realizes this and can't imagine that they would continue to compete in video cards without some big ol' 3D firepower.
      • Re:Also here (Score:2, Insightful)

        by linzeal ( 197905 )
        It has 10-bit RAMDACs; nothing close to that has come to the consumer market. The 2D has been raised a notch or two, and nVidia and ATI both have no real way of countering the 2D performance at this stage. The 3D is going to be good, perhaps better than ATI and nVidia, but Matrox has 2-3 year product cycles; they can't compete forever on 3D performance unless they restructure their company and change their engineering philosophy.
      • Spoken like a person who has no idea how computers are used in the real world. Yes, there are programs that use 3D, including modelling and games. They continue to form a very small part of the market.

        In evidence I present the Matrox G200 MMS [matrox.com], a four-head video card based on the marginally 3D-capable G200 chip. Matrox sells these by the bucketload into businesses like finance, which give some value to a card that can present four screens of 2D information. I also present the 10 Top Selling Games of 2001 [pcstrategygamer.com]. There are exactly zero games in there that can't be played on a G400, and one that would like a more powerful card (Black and White).

        Yes, Matrox realizes that they cannot compete in the high-end 3D gaming market with the G550. What you need to realize is that most of the computer users in the world don't need any 3D, let alone more than what a G400 can deliver.
        • Most computer users don't need 3D, and most computer users don't need 3 fuckin' monitors either. Of the people who are using three monitors to do their work, I am betting a good lot of them are doing something with 3D some of the time.

          A 10-bit DAC isn't that big of a deal. It is cool, but 3Dlabs' card will have one and I am sure ATI and nVidia will step up to the plate too.
            Most computer users don't need 3D, and most computer users don't need 3 fuckin' monitors either. Of the people who are using three monitors to do their work, I am betting a good lot of them are doing something with 3D some of the time.

            Speaking from personal experience, I can assure you that you're completely wrong. I've seen countless multi-headed setups in a number of companies I've worked for (and indeed, I have a dual headed setup myself in my current job). Not one of those was ever used for 3D. They're used exclusively for 2D, for displaying more information than fits on one screen. Mine shows the status of all the production machines for which I'm responsible, for example, while still leaving enough real estate for me to get my work done. In the banking world, they're used to display market prices, etc.

  • VERY low FPS (Score:3, Interesting)

    by GutBomb ( 541585 ) on Sunday May 12, 2002 @05:14AM (#3505250) Homepage
    Look closely at the lower right corner of this screenshot. This is the one without the 16xFAA too. Pretty disappointing really.

    Here is the screenshot [ixbt.com] from 3DMark 2001.

    For those too lazy to look, it shows a paltry 3 FPS.
  • Specs (Score:2, Informative)

    by joe_fish ( 6037 )
    The links are /.ed and in .ru so not much help. Before /. posted I noticed the following rumored specs:

    512bit GPU

    triple head

    20GB/s memory bandwidth

    256bit DDR(?) memory

    bump mapped surfaces

  • by joe_fish ( 6037 ) on Sunday May 12, 2002 @05:23AM (#3505267) Homepage Journal
    The Register has more details (and isn't dead):
    Go to http://www.theregister.co.uk/content/54/25238.html [theregister.co.uk]
  • by SHiFTY1000 ( 522432 ) on Sunday May 12, 2002 @05:29AM (#3505276) Homepage
    Specification:

    80 million transistors
    0.15 micron process
    Graphics core and memory clocked at up to 350 MHz
    A full 256-bit (!) DDR memory bus
    Local memory bandwidth on the order of 20 GB/sec
    Local memory capacity of 64/128/256 MB
    AGP 2x/4x/8x, including SBA and Fast Writes modes
    4 pixel pipelines
    4 texture units per pipeline (!)
    Fillrate: up to 1.4 gigapixels/sec and up to 5.6 gigatexels/sec
    Vertex shaders version 2.0 (Vertex Shader 2.0), four parallel execution units
    Pixel shaders version 1.3 (Pixel Shader 1.3), 4 texture + 5 combining stages per pixel pipeline, with the option of pairing pipelines (giving 2 pipelines with 10 combining stages each)
    EMBM and DOT3 bump mapping
    Fixed-function DX8 T&L (including extended matrix blending and skinning capabilities), actually implemented as a dedicated vertex shader
    Rendering, storage in local memory, and output to the monitor at 10 bits per colour component (!): the 10-bit GigaColor technology
    Two on-chip 400 MHz RAMDACs, 10 bits per channel, using UltraSharp technology
    Full 10-bit to 10-bit lookup tables for arbitrary gamma correction of the output image
    DVD and HDTV video decoding with 10-bit output accuracy
    Output resolutions supported up to 2048x1536x32bpp @ 85 Hz
    On-chip TV-out interface with 10-bit signal accuracy
    Two digital TMDS interfaces for digital outputs or external RAMDACs; resolutions supported up to 1920x1200x32bpp
    Two*** TRANSLATION ENDS HERE ***



    There was also a mention of glyph antialiasing... And 64 / 128 bit per pixel colour...
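
    A quick sanity check (an editorial sketch in Python) on the fillrate figures above, assuming 4 pixel pipelines with 4 texture units each at the quoted 350 MHz core clock:

    # Derive the leaked fillrate numbers from the pipeline counts and clock.
    core_mhz = 350
    pipelines = 4
    tmus_per_pipe = 4

    pixel_fill_gpix = core_mhz * pipelines / 1000                   # gigapixels/sec
    texel_fill_gtex = core_mhz * pipelines * tmus_per_pipe / 1000   # gigatexels/sec
    print(pixel_fill_gpix, texel_fill_gtex)  # 1.4 and 5.6, matching the spec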

  • I have a G400 and still use it. It used to drive my desktop and games, but I've since moved it to my Linux server. Still an awesome card.

  • mirror (Score:2, Informative)

    by flex941 ( 521675 )
    .ru link mirror here [www.city.ee].
  • by Lejade ( 31993 )
    What I really would like to know is if Matrox is going to release a Free Software driver for this thing. If I remember correctly, they used to have a pretty friendly attitude towards GNU/Linux and Free Software. I really hope this hasn't changed as I'm sick and tired of Nvidia's proprietary crap and it would be nice to finally have good hardware acceleration on a Free Software system.
    • I'll take unified (preferably open source) drivers over Free Software drivers this time. My G400 was already obsolete for a good year or two by the time semi-usable free software drivers for it emerged. The reality is that people capable of writing good free drivers are in short supply.

      I'd rather have all Matrox driver engineers slave away at good drivers for my Linux box than one or two (unpaid?) engineers volunteering their spare time. Anything else would basically mean wasting good money on hardware that I can't really use.

      -adnans (not an NVidia pimp :)
      • Nvidia, good drivers? My GeForce2 crashes on just about every driver that nVidia puts out. I for one don't feel their efforts to provide drivers are anywhere near good. Fair, maybe, but not good. Good would mean my shit doesn't crash while browsing Slashdot.
        • That's not an nVidia driver problem.

          The Linux 2880 and Win 2832 drivers are rock solid. The only blue screens I've had (and those were rare) were from memory, DMA, or a panic. X has crashed more due to KDE apps than nVidia drivers.

          • Well, I've gotten the same crashes using all manner of software (GNOME/KDE/X) in all manner of window managers. I still have to reboot to regain control of my system and get X working again.
          • Bullshit. nVidia's drivers have caused crashes on my machine, and on the machines of everyone else I know who uses nVidia cards with Linux. Just because it may be stable for *you* doesn't mean it is for everyone. For *me*, it crashes, occasionally locking up the entire system, and it leaks memory constantly.
    • They have very good support for Linux (I have my G550 working better than under Windows: no crashes ;-).
      The dual screen also works like a champ.

      Kudos, Matrox
    • Matrox has traditionally been very friendly to Linux. They produce their own set of drivers [matrox.com] (with source!) as well as a tool called "PowerDesk" which lets you switch X resolutions on the fly. In addition, they are very friendly to developers [matrox.com].

      All of my video card money goes to Matrox. End of story. I won't waste my time with these half-assed, hostile companies like ATI and nVidia. (Yeah, so I'm a little behind the curve on this whole "3D game" thing...;) I'm extremely happy that Matrox will produce a good 3D card! I have owned the Millennium II, Marvel G200, and G450 DualHead and been extremely happy with them all. Now I'm just waiting for the Marvel G1000 [sourceforge.net]!

      --Bob

  • Three-headed? (Score:2, Insightful)

    by Rolo Tomasi ( 538414 )
    Look behind you! A three-headed monkey!

    Alright, now something on-topic: do any games support more than one monitor? I remember F/A-18 for the Mac could make use of three monitors: one for the front view and one each for the left and right views. This greatly increased the feeling of realism, and was especially useful during dogfights.

    I suppose flight simulations and racing games would profit most from this.

  • Damn... (Score:3, Funny)

    by BlueLightning ( 442320 ) on Sunday May 12, 2002 @06:25AM (#3505349) Homepage Journal
    Now I have to have *three* monitors hooked up to my PC to be l33t? I can barely afford two...
  • The third monitor (Score:2, Informative)

    by Azahar ( 113797 )
    I can read Russian although I am not fluent.

    All three cards only support two independent monitors. The Parhelia can support up to three monitors using a "stretch mode across 3 screens" ("rezhim rastyazheniya na 3 ekrana"). It doesn't say how the third monitor is added. I suspect that a third RAMDAC is required, which implies a daughter card.

  • There's already a fairly in-depth technical discussion about this over at Beyond3D [64.246.22.60], with many translations from the Russian text.
  • by Kiwi ( 5214 ) on Sunday May 12, 2002 @07:31AM (#3505447) Homepage Journal
    It is good to see that, when a major new video card comes out, Linux support for the card is a given. Just four years ago, the prominent manufacturer of video chips for laptops, NeoMagic, had a very [google.com] hostile [google.com] policy towards Linux users, which still causes problems to this day [samiam.org].

    NeoMagic, however, eventually learned the folly of having an anti-Linux policy, and were forced to leave the laptop chipset market altogether; I am sure that the various laptop makers did not appreciate all of the returns from people who wanted to use Linux. In fact, NeoMagic's support web page [neomagic.com] still prominently discusses Linux drivers.

    - Sam

    • by tzanger ( 1575 ) on Sunday May 12, 2002 @07:58AM (#3505487) Homepage

      Neomagic, however, eventually learned the folly of having an anti-Linux policy, and were forced to leave the Laptop chipset market altogether; I am sure that the various laptop makers did not appreciate all of the returns from people who wanted to use Linux.

      You want to back that up even just a little bit? I love Linux and I'm running it on this Compaq Evo N160, but to think that a video *chipset* manufacturer even sees laptop returns due to Linux is absurd. In fact, if you look at the page you provided, the drivers are done by Precision Insight; someone over there probably talked them into allowing a source release, not hordes of Linux users who demanded their money back from the laptop vendors, who got so upset that they called NeoMagic.

      "Folly of having an anti-Linux policy" -- geez, do you believe the crap you write? Yes, Linux is great in servers and it's making headway (very great headway) in the desktop market, but it hasn't got the clout you try to attribute to it.

    • Matrox's Linux support has been almost meaningless since the G450, because TV-out is not fully supported, so you cannot watch accelerated movies on your TV. Basically it sucks. I had to trade in my G450 for a G400 to get TV-out working, and I won't buy anything from them until things work like they should.
    • by sheldon ( 2322 ) on Sunday May 12, 2002 @11:48AM (#3506238)
      "Neomagic, however, eventually learned the folly of having an anti-Linux policy, and were forced to leave the Laptop chipset market altogether; I am sure that the various laptop makers did not appreciate all of the returns from people who wanted to use Linux. "

      No. It was ATI and nVidia making mobile editions of their chipsets that pushed NeoMagic out the door.

      Linux is far less relevant than you desire it to be.

  • The big question (Score:4, Interesting)

    by Snowfox ( 34467 ) <snowfox@sMONETnowfox.net minus painter> on Sunday May 12, 2002 @07:43AM (#3505463) Homepage
    This is all exceptionally cool. If it's not a hoax, the specs are nothing short of amazing. I've been a big Matrox fan from day one, and it's hurt to see them relegated to a tiny side booth on the floor at GDC and Siggraph.

    I hope they're getting ready to make an explosion at Siggraph this year. :)

    But the big question is - where the hell did all this come from? Did they hire a few people away from nvidia? Did they run across some brand new wunderkind? Or is this what happens when you shelve new product development for a few years and focus on delivering a product aimed three years out, not six months out?

    Is the architecture modular and well-designed enough that Matrox can continue to compete when the other guys catch up?

    And is Matrox (I hope) back?

    • Did they hire a few people away from nvidia? Did they run across some brand new wunderkind?

      They musta' bought out BitBoys - with their Exciting!, New!, Glaze3D technology. Or perhaps they found the secret fountain of Amiga technology. Or something.

  • Three heads (Score:4, Insightful)

    by Snowfox ( 34467 ) <snowfox@sMONETnowfox.net minus painter> on Sunday May 12, 2002 @07:47AM (#3505472) Homepage
    Matrox have been trying to push the three head idea for a while now. A few games even support using two and three heads, putting extra stats and controls on a side screen, or even extending the game enough to give you a view in your peripheral vision.

    They have a few screenshots of different games which they've tricked into supporting it at the first link above. And I have to admit - it makes me a little drooly. :)

    It's also a brilliant move for Matrox: If they keep throwing out 3 head cards at a premium price - after buying one Matrox 3-headed card, who's not going to keep purchasing Matrox cards? If you got this set-up, would you disable one or two of your gaming monitors just to get the new nvidia whizbang that might be 10-20% faster?

    • With 2 PCI ATI or GF4MX cards, you can get 2, 3 or 4 monitor support. Cheaper/faster and quite likely better 3D drivers than Matrox, based on past experience.
    • Actually, forget the three-headed card - gimme the four-assed Mephisto Special card instead :-)
  • If you ever owned a G200 and waited YEARS for an OpenGL driver, you'll understand my reservations about Matrox.

    They promised an OpenGL driver before the card launched, but it was something like 2 years later before they finally got a crippled one out.

    By that time, they had the G400 out, and it could do it (with somewhat reasonable framerates), so to me it looked like they fixed a few hardware issues.

    For that reason alone, I won't go with Matrox anymore.
    • ...or the M3D add-on for the Millennium series before it, which only ever got a miniGL driver for Quake 2 written for it before being consigned to the bin marked "Technically Superior, Shame About Support".

      I feel your pain.

  • Comment removed (Score:3, Insightful)

    by account_deleted ( 4530225 ) on Sunday May 12, 2002 @08:37AM (#3505561)
    Comment removed based on user account deletion
  • by be-fan ( 61476 )
    Perhelia? Sounds painful...
  • There seems to be a growing trend to test market and trawl for ideas by supposedly leaking information.

    I'm sure many a /.'er could make up better kick-ass specs. And hey, with CPUs being made with FPGAs (see the recent /. article), certainly we are getting closer to homebrew graphics chips, too.
  • by Sarin ( 112173 ) on Sunday May 12, 2002 @10:12AM (#3505862) Homepage Journal
    I've been using Matrox dualhead video cards for quite some time now on my studio machines. It's really handy for extending the Cubase desktop across multiple screens, because there's too much information for one screen.
    It's very important that the computer runs rock solid for high-end audio applications; even the chipset on the motherboard can be the cause of a lot of problems that won't show up during non-audio use. Luckily the Matrox drivers are really solid compared to other brands.
    I was just about to order a non-AGP dualhead card in order to attach more screens to this machine, but now this card shows up and I will definitely buy it as soon as it comes out.
  • If it doesn't need a fan, I'll be buying one instantly! Otherwise I think I'll buy a G450 instead.

    I won't buy a card that needs active cooling - my PC is noisy enough already, even though it's fairly quiet as PCs go, and I want it to be totally silent!

  • 10 bit DAC (Score:2, Insightful)

    by chafey ( 108827 )
    The Matrox cards have always been considered top of the line as far as image quality goes. Sure, their 3D performance lags behind nVidia and ATI, but the images are very crisp and sharp - something very important to those who do non-gaming work at high resolutions, such as programmers and graphic artists. The 10-bit DAC is very important: it allows you to calibrate the gamma of your display without losing colors. After gamma calibration, an 8-bit DAC will not let you use all 8 bits for displaying colors, which means you will not be able to see all 16 million colors of a 24-bit image. Matrox's 10-bit DAC doesn't have this limitation and will always display all 16 million colors, even with gamma correction.
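
    A minimal sketch (Python) of why a 10-bit gamma table/DAC preserves more distinct output levels than an 8-bit one; the gamma value and the 8-bit-input pipeline are illustrative assumptions, not Matrox's actual hardware path:

    # Push all 256 8-bit input levels through a gamma curve and count how many
    # distinct output codes survive quantisation at the DAC's precision.
    gamma = 1.0 / 2.2  # assumed display gamma correction

    def distinct_levels(out_bits):
        out_max = (1 << out_bits) - 1
        return len({round(((i / 255.0) ** gamma) * out_max) for i in range(256)})

    print(distinct_levels(8))   # fewer than 256: some input colours collapse together
    print(distinct_levels(10))  # 256: every input level keeps a distinct output code
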
  • I've often heard tell in the past that Matrox has better image quality in 2D (which is what I work in). Is this still the case?
  • Way to go Matrox. (Score:2, Interesting)

    I've always liked Matrox cards, ever since the MGA and Millennium. They usually have rock-solid drivers, support for alternate OSes, and the best 2D picture on the market. Although the G450/550 might not be as fast as the nVidia/ATI offerings, they only cost $100 or so. The features and quality of these cards blow away anything that the other guys offer at twice the price. Matrox dual monitor and TV-out is way ahead, and the "headcasting" is very cool. As for XP driver support, I haven't seen a problem with Matrox cards yet. But every system is different and I have seen a lot of XP driver problems in general. You could always try the Win2k drivers. The only bummer about this card is the $500 price tag. Oh well.
  • For the past year I've been using a Matrox G200 MMS Quad [matrox.com] card. (Yeah, it's PCI and only 8MB x 4, but all I do is code.) At the moment I've got four matched IBM digital flat panels [ibm.com] hooked up. (Note, this doesn't inspire too many kind words from my coworkers!) I like it; however, #1 doesn't get much use. In Win2k, I have the taskbar in #3, so my vision is centered, with #2 and #4 peripheral. I'm actually seriously considering removing the #1 panel!

    What I'd really like to see from Matrox, and the industry, is some improvement on the 1280x1024 @ 85 Hz limit of the DVI spec [ddwg.org]. AFAIK you can't get digital output from a card to a flat panel at higher than this resolution (see the rough pixel-clock arithmetic after this post). (IBM's top-end flat panel [ibm.com] uses all four channels of a modified G200 to get a digital signal to its 2560x2048 model.)

    If Matrox's new card will support high-res digital panels on an AGP interface, I'll be first in line to pick one up.
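
    A rough sketch (Python) of the arithmetic behind that single-link DVI ceiling, assuming the 165 MHz single-link TMDS pixel-clock limit and a typical ~30% blanking overhead (both figures are assumptions, not from the post above):

    # Estimate the pixel clock needed for 1280x1024 at 85 Hz.
    active_pixels = 1280 * 1024
    refresh_hz = 85
    blanking_overhead = 1.3  # assumed CRT-style blanking

    pixel_clock_mhz = active_pixels * refresh_hz * blanking_overhead / 1e6
    print(pixel_clock_mhz)  # ~145 MHz, already close to the 165 MHz single-link cap
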
  • Speed? (Score:3, Interesting)

    by Namarrgon ( 105036 ) on Sunday May 12, 2002 @03:36PM (#3507100) Homepage
    This thing looks great on paper - depth-adaptive displacement mapping, and enough vertex shaders to deal with the resulting critical mass of triangles. Quad texturing on each of four pipes, and the requisite 256-bit DDR memory bus to keep it fed. And all running at 350 MHz... sounds like a monster - but there are a couple of significant gotchas raised by the Digit-Life [digit-life.com] translation.

    First, that massive 20 GB/s of bandwidth is going to be needed, every bit of it. There is no bandwidth-saving logic on the chip at all, unlike ATI's and nVidia's latest. Since occlusion detection can make a significant difference [pcstats.com], and Z compression and fast Z clear also help a great deal (ATI claims [anandtech.com] their 8.8 GB/s performs like a 12+ GB/s system, a 36% boost), the Parhelia could be considered to have only 55% more bandwidth than a GF4 Ti4600 instead of 110% (rough arithmetic is sketched after this post). If the next-gen offerings from ATI and nVidia have similar memory specs, the Parhelia could be at a significant disadvantage almost as soon as it comes out.

    Second, the Digit-Life article mentions that early scores (from very raw drivers) show a mere 20-30% increase in scores over a Ti4600. Now admittedly this should increase, but Matrox are not known for their 3D driver optimisations, and nVidia are. A unified driver architecture will give you a head start right out of the gate, as you can take some advantage of previous optimisations immediately, whereas Matrox will have more work in front of them to get their drivers performing near the potential of the hardware. Look at ATI; it took them 6 months of focussed effort (and the odd quality hack [tech-report.com] along the way) to get their drivers up to scratch. Matrox have not traditionally given their 3D side or their software side as much attention, in my experience.

    To me, while the triple-head feature could be useful to some (though I dislike external DACs; it's difficult to sync them closely to internal DACs, causing monitor beats), the 10-bit colour is to be applauded, and the vertex handling sounds very nice. Still, anyone looking for raw performance would be better advised to wait for the R300 and NV30.

    On a slightly different note, was anyone else disappointed by the quality of the 16x AA screenshots? I expected more. The edge-only AA feature sounds like a very good idea (though it will not help alpha textures, just like multisampled implementations), but I'm a bit jaded after the miracles promised by ATI's SmoothVision didn't exactly set the world on fire. Guess we'll have to wait for performance figures.

    Also, I wonder what their yields will be like. 80 million transistors on a 0.15 micron process sounds like something that's difficult to do cheaply.
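
    A rough version (Python) of the effective-bandwidth comparison made in the post above. The Ti4600's raw bandwidth is taken as ~9.5 GB/s, the figure implied by the "110% more" claim, and ATI's claimed ~36% gain from Z compression and fast Z clear is applied to it; both numbers are assumptions:

    # Compare raw vs. "effective" bandwidth advantage of the Parhelia.
    parhelia_bw = 20.0       # GB/s, from the leaked specs
    ti4600_raw = 9.5         # GB/s, assumed
    compression_gain = 1.36  # claimed boost from bandwidth-saving logic

    ti4600_effective = ti4600_raw * compression_gain
    print(f"raw advantage:       {parhelia_bw / ti4600_raw - 1:.0%}")        # ~110%
    print(f"effective advantage: {parhelia_bw / ti4600_effective - 1:.0%}")  # ~55%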
