Graphics Software

Matrox G550 Killer Video Conferencing Featureset? 49

Chick writes "Chick's Hardware has posted its thoughts and analysis of the specifications of Matrox's upcoming G550 chipset. Just what has Matrox been up to?" Basically, the article proposes that the "GoIP" and "Headcasting" features combine into a technique for high-speed transmission of wireframe heads for teleconferencing. It all seems plausible, but it is entirely speculative. And even if that is the actual idea, would it look any good?
  • by Anonymous Coward

    A high-end workstation GPU, but not a gamer's chip by any means.

    Matrox will have their next-gen gaming chip out in under a year - but what will NVIDIA, ATI, etc. have by then?
  • by Anonymous Coward
    I'm often amazed at the marketability of things that appeal to people's comfort zone. I can see it now... the pointy-haired boss who thinks this is a wonderful thing. Somehow the act of looking at each other makes a meeting more effective. I missed the logic in that one.

    These sorts of things tend to appeal to people who are inept at written communication. The person who wants this is probably the same person who thinks it would be great if their mail client could just automatically print out all their email so they didn't have to do it manually. Never mind that the whole idea of having a digital version is to avoid paper, but I digress.

    At least we can expect the porn industry to make good use of it. Possibly the gaming industry will come up with a good use, but in the business world this will be a colossal waste.

  • As a point of clarification: You do not have to use the Java client. There are native clients for Windows 9x/NT/2000/CE, Linux, Mac, and other OSes. The clients are small and compact (the Windows client is approx. 200 kB and is just a single .EXE file). The Java client is handy if you are trapped behind a machine that does not allow you to run arbitrary programs but gives you a web browser (such as most web-kiosk-type machines).
  • You bring up some good points; however, if the speculation in the article is anywhere near reality, then I doubt the subtle communication cues will come through. I'm probably a little pessimistic about these things. Thanks for the grammar correction. :)
  • by warmcat ( 3545 ) on Tuesday May 29, 2001 @03:43AM (#191548)
    Hollywood has been casting by head for years :)
  • by Sethb ( 9355 ) <bokelman@outlook.com> on Tuesday May 29, 2001 @05:26AM (#191549)
    There's more realistic-sounding information available on this site:

    http://www.xbitlabs.com/news/#990980645

    Aw hell, I'll just post it here:

    Although Matrox is doing its best to keep all the details about the upcoming G550 top secret and doesn't provide any info even under NDA, it still can't prevent some very unpleasant leaks. Our reliable sources close to Matrox told us some credible details about the G550, which will be officially announced on June 19, as we have told you before.

    Judging by the set of features the upcoming G550 will have, we can conclude that this solution will not be targeted for gamers. Matrox will pay special attention to DualHead technology and 2D graphics, while 3D performance will remain mediocre. So, G550 seems to be just an enhanced version of the G450.

    As for 2D, just as in the previous solution, the G550 core will have 2 RAMDACs. The primary RAMDAC will work at 360MHz, while the secondary RAMDAC will work at 230MHz. So, the highest resolution on the primary monitor will be 2048x1536, and on the secondary monitor 1600x1200, in 32-bit color mode.

    As for DualHead technology, it will work in the following modes: multi-display, DVDMax, zoom, clone, TV-Out, Snapshot, and will feature eDualHead browser enhancements. Also G550 will support dual-DVI.

    Speaking about 3D, the G550 architecture will undergo some slight changes. The G550 will have 2 pixel pipelines with 2 texturing units each. However, performance will be greatly limited by 64-bit DDR SDRAM, which Matrox considered to be the best choice for its G550 for some reason. So, the 3D performance of the G550 will be comparable with the GeForce2 MX 200 or RADEON VE, or even lower.
    But the G550 3D core will also have some nice things. It will be DirectX 8 compliant thanks to Matrox's own HeadCasting engine. This engine is none other than a T&L unit supporting extended Vertex Shaders and matrix palette skinning. The G550 will also support EMBM.

    G550 will be made with 0.18micron technology and have dual integrated CRTCs, RAMDACs, TMDS and one integrated TV encoder. The cards based on G550 will be equipped with 16MB or 32MB DDR SDRAM with 64bit interface. G550 based cards with 32MB memory are expected to cost around $140.

    Also our sources told us that 3D gamers shouldn't forget about Matrox, because the company also keeps working on a new gaming solution.
    ---
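The leaked RAMDAC figures above can be sanity-checked with simple arithmetic: a mode's pixel clock is roughly width x height x refresh rate, inflated by about a third for horizontal and vertical blanking. A minimal sketch; the 1.32 overhead factor and the 85/75 Hz refresh rates are my own assumptions, not anything from Matrox:

```python
# Rough pixel-clock estimate for a display mode. The blanking overhead
# factor (~32%) is a ballpark assumption in the spirit of GTF timings.

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.32):
    return width * height * refresh_hz * blanking_overhead / 1e6

# Primary head: 2048x1536 @ 85 Hz lands near the quoted 360MHz RAMDAC.
print(f"{pixel_clock_mhz(2048, 1536, 85):.0f} MHz")   # about 353 MHz
# Secondary head: 1600x1200 @ 75 Hz fits comfortably under 230MHz.
print(f"{pixel_clock_mhz(1600, 1200, 75):.0f} MHz")   # about 190 MHz
```

So the leaked numbers are at least internally consistent with the claimed maximum resolutions.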
  • Can't comment on that either, because I have TV-out on my second head and am using a G200 PCI for my second monitor. This works around a W2K problem with mishandling dual-head cards.
    --
  • Thanks for the info. I'd love a fancy new game card, but it's a combo work/home box, so the 80/20 rule once again applies.
    --
  • Urk, apparently the 2D Nvidia situation is so bleak that people have taken to modding their cards (http://www.geocities.com/porotuner/). On the other hand, I did find some anecdotal evidence that the VisionTek make is better than others.
    --
  • The G400 is generally considered to be the 'high-end quality' card and the G450 the 'cheap mass produced' card.

    I have a G400Max on a 21" IBM flat-tube monitor, and 2D image quality is excellent (it also was arguably the fastest 3D card when I bought it). I've never seen a G450, so no comment there.

    The only downside to Matrox is that they are verrry slow to get new drivers out. The regular driver set for Win2K didn't become really good until a year past the OS release. Meanwhile, since I picked up a RainbowRunner G capture board, I'm stuck with beta drivers that are more than a year old (and they supposedly are actually working on them..)

    Does anyone know of a 2D image quality comparison between different GeForce makes? I've heard they vary wildly depending on the integrator, but the big hardware sites publish nothing but Quake numbers.
    --
  • Will they stop someday?

    BTW, there's something strange happening: I couldn't post this without adding something to "Will they stop someday?". Here's what I got:

    Something is wrong: parent=44 dups=1 discussion=01/05/29/0831233
    Duplicate. Did you submit twice?
    Let us know if anything exceptionally strange happens

    Oh, and now I have to avoid the lameness filter. So here's a fortune:

    Bumper sticker:
    "All the parts falling off this car are of the very finest British manufacture"
  • If GoIP and HeadCasting are real (there's no indication that he didn't make this stuff up too), a much more reasonable assumption is that GoIP indeed means Graphics over IP, and HeadCasting is transmitting framebuffer information to a remote display over IP.

    Not convinced?

    Use your head (no pun intended)... VoIP, GoIP... multicasting, headcasting... and what does Matrox call its dual-output feature? DualHead.

    Actually, something like this would be quite useful if done in a generic hardware way. Something like VNC in hardware, so to say. Just stick this Matrox with the GoIP module here, stick that monitor on the other side of campus with an appropriate GoIP module, and let them talk over the existing ethernet. Or fill the campus with monitors all tuned to one of those boards and use it as an announcement/news system that is significantly easier to expand than doing runs of coax to a bunch of IP-challenged display units.

    I can certainly find applications for something like that.

    Obviously, I just pulled this out of my arse just like the author of the original non-story; however, I'd like to believe that at least I'm being more reasonable :)
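The "VNC in hardware" idea is easy to sketch in software: ship dirty framebuffer tiles over plain IP to a remote display. Here is a toy loopback demo; the packet layout (tile origin plus length header, then raw pixel bytes) is entirely invented for illustration and has nothing to do with whatever GoIP actually is:

```python
import socket
import struct

def pack_tile(x, y, pixels):
    """Prefix raw pixel bytes with the tile origin and payload length."""
    return struct.pack("!HHI", x, y, len(pixels)) + pixels

def unpack_tile(packet):
    x, y, n = struct.unpack("!HHI", packet[:8])
    return x, y, packet[8:8 + n]

# Loopback demo: the "card" sends one tile, the remote "display" decodes it.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 0))
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
tx.sendto(pack_tile(16, 32, b"\x00\xff" * 8), rx.getsockname())
x, y, pixels = unpack_tile(rx.recvfrom(4096)[0])
print(x, y, len(pixels))   # → 16 32 16
tx.close()
rx.close()
```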
  • I'd be highly surprised if "Headcasting" had anything to do with videoconferencing wireframe heads

    I agree, I don't think Matrox is doing this either. But come to think of it, it would be kind of a neat way to do teleconferencing: have the software use a webcam to make a 'skin' of each person's face, send that once at connect time, and then map it onto a wireframe model and just send info on how to animate the model, instead of video packets. This could be much lower bandwidth (read: higher frame rate / smoother animation) than actual video, and quite engaging in a cartoony sort of way. Has anyone attempted such a thing?
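Back-of-envelope numbers show why the scheme is attractive. In this rough sketch every figure (webcam resolution, number of facial control points, bytes per point) is an assumption of mine, not anything from Matrox or the article:

```python
# Illustrative comparison of raw video vs. model-based "headcasting"
# bandwidth. All constants below are made-up but plausible for 2001.

VIDEO_W, VIDEO_H = 320, 240          # assumed webcam resolution
BYTES_PER_PIXEL = 3                  # 24-bit RGB, uncompressed
FPS = 15

# Raw video: every frame ships the whole image.
video_bps = VIDEO_W * VIDEO_H * BYTES_PER_PIXEL * FPS * 8

# Model-based: the face texture goes over once at connect time; after
# that, each frame is only a small set of animation parameters.
N_CONTROL_POINTS = 60                # hypothetical facial control points
BYTES_PER_POINT = 6                  # e.g. three 16-bit deltas per point
model_bps = N_CONTROL_POINTS * BYTES_PER_POINT * FPS * 8

print(f"raw video : {video_bps / 1e6:.1f} Mbit/s")   # → 27.6 Mbit/s
print(f"headcast  : {model_bps / 1e3:.1f} kbit/s")   # → 43.2 kbit/s
```

Even allowing for heavy video compression, the animation-parameter stream is orders of magnitude smaller, which is the whole point of the idea.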

  • It will be DirectX 8 compliant due to Matrox's brand HeadCasting engine. This engine is none other but a T&L unit supporting extended Vertex Shaders and matrix palette skinning. G550 will also support EMBM.

    Y'know, this is why Slashdot works. The editors toss stuff out, some good, some bad, and the community here tears it apart, researches, and comes back with good, useful information.

    This is the useful aspect of /.; for all the bitching people here do about "editorial this and that" and "Taco can't spell", the *real* meat of /. is in the intelligent users.

    You hit a "dumb" question about, say, "how do I carry my gadgets", and you wind up with 15 to 25 good suggestions, from the solid to the oddball concepts. For all that people bitch about the "editors" and "writers" here, it's us... maybe we don't edit (moderate) the root stories that spawn all the comments, but we write and edit 99% of the content on this site. If you're bitching about /., you're not reading solid posts like the one above - and you're sure as hell not posting them.

    --
    Evan

  • All of this is entirely extrapolated from the two words: "GoIP" and "HeadCasting", which may *themselves* not even be real.

    The entire article is (very self-admittedly) made up of grapevine and pure flights of fancy. This is web-journalism at its very worst, showing just *why* you can't trust things off the web without looking at their source: He admits he made all this up!!!

    Even assuming the two buzzwords "GoIP" and "HeadCasting" are real, I'll buy the "Graphics over IP" concept (he doesn't even seem to know what the -oIP suffix stands for, just "it's some internet thing"), which leads to the next obvious idea: that HeadCasting is sending the graphics from one or more "heads", i.e. displays, to a remote computer. Possibly involving hardware, possibly not.

    But then, that's completely *my* extrapolation, and no better (okay, a little better) than his.

    Besides... I can do that with my G400 right now. It's called X... or VNC, or any of a dozen different protocols. Maybe Matrox has a new Good Idea on the subject, though... assuming it has *anything* to do with remote graphics at all.

    --
    Evan "Happy Matrox user, not a gamer, YMMV"

  • Comment removed based on user account deletion
  • head != boobs
    +++++
  • Well, I was thinking that this whole GoIP and headcasting thing might be some interface which captures frames directly from the webcam (or whatever the recording device is) and sends them, possibly compressed, to the destination, without any software needed for capturing, sending, etc. The Matrox video card on the other end can read these IP packets directly and extract the video information from them!
  • Called TRUFORM, this technology provides stunning image quality without the need for tonnes of polygons, and can be used to make older games look amazingly good.

    An overview is here [amdzone.com].

    The press release is here [amdzone.com].

    I will let the article speak for itself, particularly the images, which show that it really works.

    TRUFORM can replace pixel shaders, apparently. Anyway, Matrox will have to have something more than "real bumps" to win more of the market back. The G550 is a stopgap, though, at about the level of the GeForce2 MX, just to get Matrox more revenue through the door this year in preparation for next year's "killer" graphics chip.

  • There's nothing new about 'TruForm'; the technique has been employed for years.

    In fact, ATI may even run afoul of Pixar patents, but I'm sure their legal department has dealt with this possibility.

    It's nice to be able to do it in hardware, but it wouldn't surprise me if all the work was done in software by the driver, rather than on the geometry engine.

    If the technique is implemented fully in hardware, then I don't see how exposing it as an OpenGL extension (or whatever Direct3D uses) could be a bad thing, since using sub-surfs where possible will certainly improve the appearance of most 'organic' 3D models.

    However, enabling this 'by default' on all geometry is almost certainly a stupid thing to do.

  • I think Matrox will have to delay their new chip ... after reading this article, the developers laughed so hard they needed a month to recover ...
    Can you imagine working on stuff that is described in the parent of this post, and then reading that article? Man, that would crack me up ...
  • check out www.headfone.com [headfone.com]. Sorta primitive, not free, but cool nonetheless.
  • by Emil Brink ( 69213 ) on Tuesday May 29, 2001 @03:50AM (#191566) Homepage

    I'm no regular reader of Chick's Hardware, so I don't have any "gut feel" about their reliability...

    I was confused by the layout of the article. It starts out by saying that it's based on rumors, but follows that up with what looks like a real feature-set specification (complete with corporate-style disclaimers). What gives? Is the G550 an announced product? I zoomed over to Matrox [matrox.com]'s site, of course, but couldn't find it. Not even on their press release [matrox.com] page... Weird. Can anyone clear that up for me?

    Anyway, the specs seem real enough (although I'm not sure what a "draw cell" is), and the speculation doesn't come off as too weird either... I guess we'll just have to wait and see what Matrox is planning. ;^)
  • by selectspec ( 74651 ) on Tuesday May 29, 2001 @04:11AM (#191567)
    They should get the author of that article and make him an editor here at /. His blood level of Trollocondrians ranks him as a FUDi-Master more powerful than even our master Katzoda.
  • According to http://www.xbitlabs.com/news/ [xbitlabs.com]:

    "But G550 3D core will also have some nice things. It will be DirectX 8 compliant due to Matrox's brand HeadCasting engine. This engine is none other but a T&L unit supporting extended Vertex Shaders and matrix palette skinning."
  • ...this is particularly crap. This guy is not familiar with the standard use of the phrase "head" in relation to video cards. To quote: "Head means head obviously".

    So of what possible interest could his "thoughts and analysis" be?
  • Well, I disagree re: headcasting in the article's sense sucking because of low bandwidth; it sucks on so many deeper levels.

    In fact, the point of this kind of headcasting is to get low-bandwidth pseudo-videoconferencing by use of some sort of avatar. You watch what I am doing and send that, rather than a photo. "BigLig is smiling" is a shorter message than the video of me smiling. Then stick a digital photo of my face on a model, morph it to smile, va va voom.

    The thing is, though, that you need a crapload of stuff at my end to work out that I am smiling, so if I care that much about you seeing me smile, I'll put the money into a bigger pipe rather than an expensive face-scanning device, and send real video, which is better.

    And, returning to my earlier point that this article is clearly crap: even if you did want to do this kind of headcasting, you don't need anything fancy at the other end to display it. My much-loved old TNT card could do this sort of rendering standing on its head, so who needs extra stuff built into the video chipset?

  • by bill.sheehan ( 93856 ) on Tuesday May 29, 2001 @05:33AM (#191571) Homepage
    Since it appears that the original article is specious, at best, the only thing remaining to discuss is whether "Headcasting" is a good idea. To this discussion I will contribute one word: Bandwidth.

    Unless you and everyone you're talking to has lots and lots of bandwidth, any kind of "headcast" is a pipe dream. Either the picture of your correspondent is the size of a postage stamp, or there's a palpable delay that fouls up visual cues and leaves parties stumbling over themselves when they try to jump in to the perceived pause in the conversation.

    None of this is a slam at Matrox. I use a Matrox G450 myself and am very pleased with it.

    "This is CNN." "You must avenge my death, Kimba... I mean, Simba." "Luke, I am your father."

  • I don't know what the heck that means, and the writer doesn't do a very good job of convincing me either.

    I have a hard time believing that sending heads over the internet has anything to do with the graphics card.

    And if he knew just a little bit about XFree86, he would know it can now do fast OpenGL over TCP/IP. At least on nvidia cards, and NVIDIA doesn't boast any headcasting feature.

  • I'd at least check out the reviews before you count it out, maybe other people were more impressed.
    Unfortunately, but unsurprisingly, the reviews focus almost solely on the 3D performance. I'd like to find a hardware review site written by graphic design and Mac guys.

    A few years ago I was swapping hardware between an NT and a FreeBSD box, and that was when I saw a first-hand comparison of the TNT2 and Xpert 98. Now, in terms of 3D performance, the Xpert 98 can't touch the TNT2, but when I swapped cards and put the TNT2 in the FreeBSD box, I had to turn the monitor brightness up a whole lot to see the screen clearly. Conversely, I had to turn the brightness down on the NT box to keep from blinding myself. The TNT2 was dark and mushy on the desktop. Now that I have the GeForce, I wish I still had the Xpert98 for comparison.

    --

  • 2D is definitely not nVidia's strong point. The GeForce2 boards are definitely better than the TNT2 boards, but they still lack a certain brightness and crispness that was always present with my ATI Xpert98. If the G400 is better than that, I'm sure I'll be pleased. Thanks for the info.

    Remember how you used to have separate 2D and 3D chipsets? (i.e. the lame onboard chip for the desktop and the Voodoo1 for games.) I find myself pining for those days... I mean, the integration is great for most people, but I'd love to be able to use the Matrox for 2D and the GeForce2 for 3D: the best of both worlds. My GeForce2 (an ELSA Gladiac, btw) does look good in 2D, but it doesn't look great; it's obvious that more attention was paid to superior gaming performance than to a superior desktop.

    --

  • The idea behind HeadCasting or whatever this thing is called (assuming it is real anyway) has actually been around for quite a while... For my final year at university I worked on a teleconferencing system which used head tracking on the server system and wireframe rendering on the clients...

    The object of this is that you end up transmitting just the movements of the head and the facial expressions of the speaker, which of course uses MUCH less bandwidth than video in any case...

    On the client you simply rendered a head and used the speaker's image as a texture... Expressions were simulated by transforming the wireframe...

    It's not a bad idea if you think about it, but the main problems were that it was pretty 'heavy' computationally (because of all the head tracking and feature extraction stuff) and the output looked quite ugly... Of course, the more vertices the model had, the better the quality you got, which I think is where the G550 comes in handy :)
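The transform-the-wireframe step described above is essentially what is now called blend shapes (the "matrix palette skinning" in the leaked specs is a related trick): the client keeps a neutral mesh plus per-expression vertex offsets, and each frame on the wire is just a handful of weights. A minimal sketch with made-up data; the mesh, expression names, and weights are all illustrative:

```python
# Blend-shape morphing sketch: blend per-expression offsets into a
# neutral mesh. Per frame, only the tiny weights dict crosses the wire.

def morph(neutral, expressions, weights):
    """Return the neutral mesh with weighted expression offsets applied."""
    out = list(neutral)
    for name, w in weights.items():
        for i, delta in enumerate(expressions[name]):
            out[i] += w * delta
    return out

neutral = [0.0, 0.0, 1.0, 0.0]                  # toy 4-coordinate "mesh"
expressions = {"smile": [0.0, 0.2, 0.0, -0.1]}  # offsets from neutral
# One transmitted frame: a half-strength smile.
frame = morph(neutral, expressions, {"smile": 0.5})
print(frame)
```

More vertices and more expression channels raise the rendering cost, not the bandwidth, which is exactly where hardware T&L would help.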
    • That hardly seems reasonable to me. In fact, it's just about the stupidest idea I've ever heard.
    What did you expect from a site called "chickshardware"?

    </TONGUEINCHEEK>

  • by BMazurek ( 137285 ) on Tuesday May 29, 2001 @03:34AM (#191577)
    Cool....now I can have my very own Max Headroom [maxheadroom.com].

    Well....at least I will be able to, twenty minutes into the future...

    The 80's....won't they ever go away? :)

  • On being offtopic: Have you used VNC? I've heard about it but haven't had a chance to test it out. Is it any good?

    It's quite excellent. I've been using it for about 8 months now. I like it much better than PCAnywhere since there's no client to install; the client is just a Java applet that gets served out when you hit the server's port. My only complaint is that there's no logging for it that I could find.
  • by Echo|Fox ( 156022 ) <forums@disc[ ]ia.ca ['ord' in gap]> on Tuesday May 29, 2001 @04:03AM (#191579)
    That hardly seems reasonable to me. In fact, it's just about the stupidest idea I've ever heard. What in the bloody hell would be the point of accelerating wireframe heads for videoconferencing? Matrox ain't dumb ... hence the reason they've stayed out of the R&D-expensive power-gamer market and instead concentrated on the much more lucrative business market. I'd be highly surprised if "Headcasting" had anything to do with videoconferencing wireframe heads.

    What is it, then? One must look no farther than another one of Matrox's buzzwords: Dualhead, i.e. two displays off of one video card. When combined with GoIP, if it is indeed Graphics over IP, it seems much more "reasonable" that this instead means that you can broadcast one of your "heads" (read: displays) to another computer (or computers).

    Certainly nothing we haven't seen before in standalone applications (i.e. VNC and whatnot) but if this was tightly integrated into the Matrox drivers and very intuitive it would be pretty cool. Think about it, your primary monitor is showing your desktop with whatever you're working on. Your secondary monitor is using Headcasting to broadcast someone else's, while yours is broadcast to their second monitor. Quick and easy collaboration through the magic of the Internet.

    I can foresee all kinds of uses for such a thing, even just within an office, let alone worldwide.
  • They can do it now, just not in real time. And I'll sell you a video card that can render pitch-black rooms photorealistically. Seriously, the time is coming. Extremely high-end workstations still need hours to render single frames of CGI for film, but it's decreasing.
  • Ugh. No one wants TRUFORM-type technology. People like John Carmack have repeatedly spoken out against NURBS and curved-surface renderers in graphics cards. While TRUFORM doesn't take a curved surface as input, it still has to approximate and guess what the higher-polycount model would have looked like had it actually had all the data points. If the card is capable of rendering lots of triangles (such as a GF3 or GF2), why not just blast all those tris down the AGP pipe? That's the real question here, because you are decreasing the quality/size of a model for transfer over the AGP bus and then raising the poly count again. Is AGP 4X that much of a bottleneck? For example, take a high-quality WAV file, convert it to a low-quality WAV, then interpolate it back to a high-quality WAV. Play it back, and you will notice a loss in fidelity. The exact same thing will happen with the model, no matter how good your interpolation is.
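The WAV analogy is easy to demonstrate: throw away samples, interpolate them back, and measure the error. A quick stdlib-only sketch, with simple linear interpolation standing in for whatever reconstruction scheme a card might actually use:

```python
# Decimate a signal, rebuild it by linear interpolation, and report the
# mean squared error against the original. Detail above the reduced
# sample rate is irrecoverably lost, matching the poster's point.
import math

def resample_loss(samples, factor):
    """Keep every `factor`-th sample, interpolate back, return the MSE."""
    kept = samples[::factor]
    rebuilt = []
    for i in range(len(kept) - 1):
        a, b = kept[i], kept[i + 1]
        for j in range(factor):
            rebuilt.append(a + (b - a) * j / factor)
    rebuilt.append(kept[-1])
    # Pad to the original length for a fair comparison.
    rebuilt += [rebuilt[-1]] * (len(samples) - len(rebuilt))
    return sum((x - y) ** 2 for x, y in zip(samples, rebuilt)) / len(samples)

# A "detailed" signal: a high-frequency sine the decimation can't preserve.
sig = [math.sin(2 * math.pi * 0.3 * n) for n in range(200)]
err = resample_loss(sig, 4)
print(f"MSE after 4x decimate+interpolate: {err:.3f}")
```

No interpolation scheme can reconstruct the detail that was discarded; at best it guesses plausibly, which is the poster's objection to doing this with geometry.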
  • Sounds like one of these features people will find cool until they realize it's actually boring and tedious.

    Gi-gi-giv-give us full Ma-ma-max-ax-x Headroom or no-noth-nothing at all.

    (Better yet, make all women look like Lara Croft!) I didn't say that, did I?

    --
    All your .sig are belong to us!

  • It's interesting to see hardware manufacturers speculate on the direction technology will take and, on top of that, to watch columnists speculate on the intent of the hardware manufacturers. This is why the whole thing is rather vague. It's quite possible that software developers might completely ignore the new features offered in the hardware and go in an entirely different way. That's one of the built-in risks in the hardware industry...

  • before GPUs are able to render images that can't be differentiated from real images captured by a video camera?

  • Comment removed based on user account deletion
  • by account_deleted ( 4530225 ) on Tuesday May 29, 2001 @05:08AM (#191586)
    Comment removed based on user account deletion
  • head != boobs

    No, but if you give the producer a little head, you can still get the part even if you don't have boobs!
  • Yeah, I think you're a lot closer to reality than the author of this bit of drivel. I personally liked Ant's summary [matroxusers.com] of this article over on the MURC [matroxusers.com].

    In particular is this quote: "It all starts off OK but then all of a sudden the LSD the guy dropped must have kicked in."

    That about sums up the quality of this little bit of web journalism.
  • Hey, I work at a university, and presentations, distance learning, etc. are very much useful applications for video and voice. Be careful not to live in a bubble; there are useful applications for most everything, it just may not be the most useful for you.
  • I couldn't get into the article to read it *too far away?*, but if it is made up, then why does Slashdot even bother posting it yet? Why not wait until someone produces some benchmarks and there are servers for X11?

    On being offtopic: Have you used VNC? I've heard about it but haven't had a chance to test it out. Is it any good?
    ----
  • We need pictures to enable our skill at reading people's body language, which calls for high resolution at a low frame rate, unlike many teleconferencing systems of today (low resolution at a high frame rate). In the long run, it's probably best to concentrate on high frame rates at lower res. People can live with lower res, because software rendering can give it a virtual resolution that appeals to the eye. If we go in the other direction, low frame rate and high res, then we can create virtual frames: take two frames and guess the positions where each object would be between them. Unfortunately, this takes more logic and requires both frames, a beginning and an end, before the in-between frame can be calculated. That adds lag, because you have to wait for the second frame before you can insert the calculated one. In 10 years or so we'll have high res and high frame rate, so this won't be a problem.
    ----
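The virtual-frame idea in the comment above reduces, in its simplest form, to blending two captured frames; real systems add motion estimation on top, but plain blending shows the mechanics (and why you must buffer the later frame first, hence the lag). A toy sketch on made-up "frames":

```python
# Synthesize an in-between frame by blending two captured frames.
# Frames are modeled as flat lists of pixel intensities for simplicity.

def interpolate_frame(frame_a, frame_b, t=0.5):
    """Blend two frames at position t in [0, 1] (t=0.5 is the midpoint)."""
    return [round(a + (b - a) * t) for a, b in zip(frame_a, frame_b)]

frame1 = [0, 10, 20, 30]     # toy 4-pixel "frames"
frame2 = [40, 50, 60, 70]
mid = interpolate_frame(frame1, frame2)
print(mid)   # → [20, 30, 40, 50]
```

Note that `frame2` must already have arrived before `mid` can be shown, which is exactly the extra latency the poster describes.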
  • Resolution only matters once we get past 20 frames per second (the point where the brain starts to perceive a series of images as motion). If we can't reach that, high res isn't important. If we have enough bandwidth for 20+ frames/second at 640x480 or better, then you can play with the res. If they're not at that point, fps matters more for good motion.
    ----
