
New Graphics Company, With Working Cards

gladbach writes "Tom's Hardware has in their hands an actual working card, unlike other vaporware card makers *cough* Bitboys *cough*... To quote Tom's: 'A new player dares to enter the graphics card market that ATi and Nvidia have dominated for so long. XGI (eXtreme Graphics Innovation), based in Taiwan of course, comes at the market leaders with a line of cards for a whole lot less money. We look at XGI's product range, and offer results from a beta sample of XGI's top model, the Volari Duo V8 Ultra.'"


  • by kauttapiste ( 633236 ) on Sunday November 09, 2003 @02:12PM (#7429408)
    Well, if they provide good support for X and OpenGL, and maybe even open source drivers, I'll be buying one instead of that NVidia I was planning to get soon.
    • Well, according to Linux Hardware [linuxhardware.org], there will be XGI-provided Linux drivers in a few months!
  • interesting (Score:2, Interesting)

    by Anonymous Coward
    That's pretty nice, hope they bring down prices... looks like their best will be priced at around $300. If it isn't a total flop, ATI and Nvidia might face some competition. I'd sure like to get a high performance card for $300 or $350, and a nice midrange for around $100... we'll see. Maybe a midrange card on sale might even go for $70. I've seen some 5600s @ $90, and it could go down more.
  • by Pingular ( 670773 ) on Sunday November 09, 2003 @02:15PM (#7429423)
    A new player dares to enter the graphics card market that ATi and Nvidia have dominated for so long
    What about Matrox [matrox.com], who've been dominating the multiple monitor graphics card market for years?
    • by Anonymous Coward
      Oh man, now prepare for the FPS boys, you used the evil Matrox word ;-)

      They dominate the scientific market too... especially medical imaging...

      The Parhelia is a hit amongst pro gfx people...

      But none of them will let you play Quake III at 500 fps
    • by Anonymous Coward
      The difference is that XGI makes dirt cheap slow parts, while Matrox makes ridiculously expensive slow parts.

      Seriously: it sounds like Matrox is going down the tubes. They recently downgraded their card from DX9 to DX8 and have been having major problems getting new driver releases out and dealing with game compatibility. There might be a handful of 2D applications that need Matrox cards, but it's not a good general purpose solution nor anything you should even try for gaming.
      • While one can play games on a Matrox, they were NEVER the best for any 3D games, so I wouldn't use gaming performance as a reason that they are going "down the tubes". While games are a significant part of the graphics card segment, they are hardly the only part, and just because the company doesn't really compete in games doesn't mean it doesn't have a strong market niche.

        For everything else that isn't 3D gaming, I've found Matrox cards to do plenty well and they did everything I wanted, plus a few things thei
    • by Ark42 ( 522144 ) <slashdotNO@SPAMmorpheussoftware.net> on Sunday November 09, 2003 @02:25PM (#7429502) Homepage


      I personally find NVidia's TwinView to work a lot better, if you have a card with two outputs, which most of the GF4s and higher seem to have now.

      We might as well ask about S3 [diamondmm.com] if we're asking about Matrox. Remember that great card they had a while back?

      The best we can hope for is a price war, I think. Cheaper Nvidia or ATI cards are always better.

      • Of course, for gaming ATI and nVidia are better but I've always had better experiences with Matrox for business and CAD apps.

        For most features on dual head, ATI, nV and Matrox are pretty close, although there are two or three features that some people find ridiculously handy.

        The one thing that Matrox can do that I haven't found in ATI or nVidia is to be able to have two signal outputs that are identical (dual head clone). That way I can have the exact same output for both my video projector and a 21" CRT.
      • by rsmith-mac ( 639075 ) * on Sunday November 09, 2003 @04:17PM (#7429914)
        FYI, Diamond is not S3 any more. S3's core graphics division (i.e. the chip engineers) was merged with VIA, and the rest of S3 was sold piecemeal. Best Data bought the remainder of the graphics division (the names, the card designs, etc.) and are relaunching Diamond MultiMedia as a video card company making ATI and Nvidia cards.
      • by Anonymous Coward
        For analog video (CRTs) Matrox has always been the best as far as image quality.

        ATI and nVidia (traditionally) had horrible video quality (blurry edges, ghosted edges due to ringing, etc)

        A while back, I helped a friend of mine upgrade his video card to gain dual display. The first thing we tried (his first choice) was some type of GeForce4 card - it worked, and ran games plenty fast, but the image quality was not really that great, particularly for high resolution desktops (dual 19" Sony CRTs), and the coo
      • But have you taken a look at their Parhelia HR256?
        Quad DVI interface to drive LCDs at 3840 x 2400 pixels... =)
        Now we're talking space!
        Ok, maybe not something for those gamers out there. Don't think that the Parhelia, or any other card, has the power to run any kind of game at reasonable speeds at those resolutions.
        But for any kind of serious work: WOW!
        I'd love one of these! =)
    • Matrox? I haven't seen a Matrox card in action for years. Nvidia's TwinView seems to be dominating that segment now.

      At least for people who care about performance, anyway.
    • Another company, S3, is back with a new line of cards here [s3graphics.com]. They are also looking to break into the market this quarter.
    • by Jucius Maximus ( 229128 ) on Sunday November 09, 2003 @03:34PM (#7429767) Journal
      "What about Matrox [matrox.com], who've been dominating the multiple monitor graphics card market for years?"

      True. IMO Matrox is best for non-gaming applications. One thing a lot of people forget is that Matrox is a significant player in the digital video products that deal with video capture and real time editing. Their RT.X [matrox.com] series of DV editing products are among the best in the price range.

    • What about Matrox...

      Don't even get me started! :) I just ordered an ATI Radeon 9800 Pro to replace my Matrox Parhelia card. The Linux drivers that are available for the Parhelia are lousy: not accelerated*, not officially supported (meaning if you can't make it work, tough luck), and only available for RedHat 9. (They might work with other distros, but *I* couldn't get them to compile under Mandrake or Debian; your mileage may vary.)

      * If you ask politely, they'll e-mail you a link to an accelerated dri
  • by stone2020 ( 123807 ) on Sunday November 09, 2003 @02:15PM (#7429425)
    Did Slashdot just discover Tom's Hardware this week?
  • by dolo666 ( 195584 ) * on Sunday November 09, 2003 @02:16PM (#7429430) Journal
    I remember when Nvidia came into big production, and a lot of us Voodoo owners were skeptical at first that anyone could oppose the great Voodoo cards from 3dfx (the same cards that revolutionized how we all play Quake). Smart business policy, quality hardware and lots of blood, sweat and tears have pushed Nvidia to where it is today. As a gamer, I welcome any new blood at the table, because it just means that the race for the mother of all graphics suites is getting that much more interesting... and the road is shorter when the competition is fierce!

    Competition is the mother of invention, if necessity can't possibly be. :)

    Seems that XGI is going after some odd designs, using the fabled 3dfx dual chip design as a way to get more bang for the buck. It's not a perfect solution, as Tom's Hardware reveals that it brings its own problems. The problem? The memory isn't shared, so only half of the onboard RAM is actually usable. *ouch*!

    Still, this is the first line for XGI. I'm sure we'll see a lot more from them, if they don't go broke.
  • Drivers (Score:5, Insightful)

    by Indio_do_Xingu ( 675644 ) on Sunday November 09, 2003 @02:17PM (#7429438) Homepage
    The most difficult thing is to release good drivers. Until then, I will wait to see how they really perform.
  • by Chris_Stankowitz ( 612232 ) on Sunday November 09, 2003 @02:18PM (#7429447)
    From the article: Of course, there are no official comments on who is pulling the strings, but it seems clear that XGI is on solid financial ground.

    And that is all I really wanted to hear. Thank goodness they won't be a fly-by-night company. This is a very welcome addition to the market. Let's also hope that they either make Linux drivers or open the architecture so developers can do so.

    The great thing about new companies with financial legs to stand on is that they can learn from other companies' mistakes without having to make them themselves, as well as learn from the things those companies did right.

    I for one welcome our future (their mission statement is to be #1 by 2007) GPU overlords!

  • by W2k ( 540424 ) on Sunday November 09, 2003 @02:21PM (#7429469) Journal
    I don't foresee this going very well for XGI. Firstly, look at the cards. Dual chips, non-shared memory? 256 megs on the card, only 128 available because the chips can't share (see the sketch below). "Wasting" 128 megs might be acceptable, considering the card is still pretty cheap, but how about when high-end cards start coming with 512 MB or more? If XGI starts putting 1024 MB of memory on their cards, they are going to see any pricing advantage their cards may have go bye-bye pretty quickly. Remember, going for quantity rather than quality was what killed 3dfx. How quickly some people forget :)

    Second problem is that due to the size of the card, it's not gonna fit in smaller form-factor PCs. Why they put such a huge heatsink on the back of the card, where there's usually not much space, instead of just putting more cooling on the front of the card, where high-end users (of nVidia cards, anyway) are already accustomed to leaving a PCI slot open to make room, is beyond me.

    Those two big fans they've stuck on their reference board sure aren't going to help keep noise levels down, either. My (reference) Radeon 9800 Pro still beats the crap out of most cards on the market today, and it's only got a small HSF for the chipset and nothing on the memory chips. And I was still able to OC it quite a bit. If nVidia's and XGI's chips really require as much cooling as manufacturers stick on them, even on "reference" boards, they must be very inefficient chips indeed. These things aside, it's always nice to see more competition in the graphics chipset business; hopefully prices might come down a bit as a result if ATi and/or nVidia see XGI as a real contender, rather than a wannabe like Matrox (though I don't know if they're even at the "wannabe" level any longer, considering how poor their chips are nowadays).
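    A minimal sketch of the memory arithmetic behind the parent's point, assuming each chip has to keep its own private copy of textures and framebuffer (the function and the 512 MB case are illustrative, not taken from XGI's actual design documents):

      # Illustrative only: if two GPUs each mirror the same data, the pool an
      # application can use is the per-chip amount, not the total on the box.
      def usable_vram_mb(total_mb: int, chips: int, shared: bool) -> int:
          """Memory (in MB) actually available to an application."""
          return total_mb if shared else total_mb // chips

      print(usable_vram_mb(256, 2, shared=False))   # 128 -> the Volari Duo case
      print(usable_vram_mb(512, 2, shared=False))   # 256 -> a hypothetical "512 MB" dual card
      print(usable_vram_mb(256, 1, shared=True))    # 256 -> ordinary single-chip card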
    • OK, Nvidia/ATI have some drawbacks compared to Matrox and other high-end makers. It may kill them on price/performance, but people buy CPUs with crappy price/performance (Xeon, P4EE (well, theoretically at least!)) when they could get chips with a much better price/performance ratio.

      Matrox & other professional cards simply look better for most things, and Nvidia doesn't support some things like sub-pixel antialiasing.

      An example of this: http://www.3dlabs.com/product/technology/Wildcat_Quality_ro.doc (It's a

    • by BrookHarty ( 9119 ) on Sunday November 09, 2003 @03:32PM (#7429758) Journal
      Remember, going for quantity rather than quality was what killed 3dfx. How quickly some people forget :)

      Well, I disagree. The reason 3DFX was killed off is that the partners decided they wanted to be the only vendor for 3dfx boards, so they killed off all 3rd party vendors. Then they bought STB so they could release boards themselves, and that never panned out. So with 3DFX's limited release, Nvidia expanded with new OEM partners and easily took over the market. Vendors need to release boards, even cheap OEM boards; they need a product to fill that slot, and 3DFX took that away from all the smaller companies. And believe it or not, lots of people buy OEM quality products. Even CompUSA and the main stores carry an OEM selection with a sticker on the box to make it look retail.

      3DFX could have lasted another 2 years with the multi-GPU design, and had another GPU in development. The partners tanked the company, sold it for a few million, and walked away. They had better visual quality at the time. OpenGL was a little flaky, but could have been fixed with drivers; they had a great development team that moved on to other GFX companies.

      I had a Voodoo5 and a GF3-ti500, both the hottest cards out at the time. The Voodoo5, with only 2 GPUs, kept up in almost every game. But after the drivers stopped coming from 3DFX, you had to move to different hardware.

      This is where SiS+Trident=XGI can rock. Trident has been making OEM chipsets for years, and SiS makes motherboard chipsets. With a good design, using multi-GPUs to make boards faster, they can cut a good niche out of the low end market.

      ATI and NVidia use the same GPU cores for most boards; they just cripple and use the GPUs that don't pass high end tests. This is why, when the ATI 9500 came out, you could driver-hack it to a 9700. They also use less expensive RAM and limit the hardware. This doesn't actually save that much money, a few bux, but the selling price can be 50-100 dollars more. Most people won't spend 400 bux on a high end GFX card, but the 99 dollar sweet spot is a big field, and if XGI can come up with a card to fill that niche, they could come out as the heavy hitter and take over a major piece of the market.

      Also, back in the .com days, SiS and Trident made most of the low end SVGA cards that powered almost every server I've seen, until Intel started putting gfx chips on boards. They have the background, and with the manpower out in the market right now, they could easily take 10-15% of the market on initial release.

      Of course, after reading the article - a 300 bux release, and not the top performer - they had better tweak the hell out of those drivers. Maybe multi-core could help, but at the performance of the 9500/5700, it needs a little more. Maybe driver tweaks, since it's still beta, could bump it up. Nvidia and ATI are already tweaking their drivers like crazy if you watch the benchmarks out every month, so tweaking and optimizing drivers does help.

      -
      Power by Nvidia, pfft. Spend more time fixing bugs, and less time suckling on the nvidia tit...
  • by Anonymous Coward on Sunday November 09, 2003 @02:21PM (#7429471)
    XGI is just Trident with a new name, and Xabre added in

    From webpage:

    'Founded in May of 2003 and headquartered in Taipei, Taiwan, XGI pulls from a deep reservoir of engineering and design talent stemming from its acquisitions of Trident Microsystems, and Silicon Integrated Systems' graphics divisions.'

    Although it says Taipei, most of the hard-core engineers are in San Jose, in the old Trident building, right across from Fry's Electronics. Go say hi, they're a nice bunch.
    • The rank-and-file may be nice people, but I don't remember any Trident or SiS graphics chip living up to the hype, so they end up getting relegated to the budget market.

      I wish them luck, but I hope that they have something up their sleeves to fix all the deficiencies found in Tom's testing.
    • Righto. XGI will be irrelevant in the Windows world because of price/performance compared to the lower-end ATI and Nvidia parts. It could become relevant in the Linux world if they had open source drivers or if they simply released specs for programming the chips. They haven't so far, so even with a promised binary X11 driver, they're destined for the scrap heap of Chapter 11 video card companies.

  • Whoa!! (Score:5, Funny)

    by garglblaster ( 459708 ) on Sunday November 09, 2003 @02:23PM (#7429481) Journal
    ..XGIs top model Volari Duo V8 Ultra

    WOW!!

    but - wait a minute - guess I'll wait for the Volari Quattro Triplex V12 Turbo GTI XXL
    _ThAT'll be a nice graphics card!

    • Ya know, I actually think that the names of the parts are the coolest part, so far. The benchmarks and screenshots definitely left something to be desired.

      Namely, quality and efficiency.

      Then again, I'm still using a Radeon 7000 PCI, and a Voodoo3 in my two computers. So.. I can't bitch much.

    • Why wait? SGIs already ship with V12 graphics.

      See. [sgi.com] They're not cheap though.

  • external video bridge chip SIS301 MV, which is also responsible for the TV output

    Why oh why do companies insist on using SiS's chips for TV out! It's horrible!! VIA used them on the Epia boards, and it stinks! Great little boards, don't get me wrong, but the TV out is nasty. Now these guys. Will the pain never end??!

    [/sis rant]

    But good luck to them :) Competition is good!
  • IMHO, don't expect a miracle. Don't forget, long-standing, legitimate graphics card makers (such as Matrox, or years ago - Hercules) all tried to get into the high-end 3D type accelerator card market and failed miserably. Even lately, you hear some rumbling coming from Matrox every once in a while about some new "killer 3D card", and it always turns out to effectively mean nothing.

    ATI and nVidia being the only 2 real competitors in the 3D gaming card market isn't such a horrible thing. (Yes, things woul
    • by Anonymous Coward on Sunday November 09, 2003 @03:32PM (#7429760)
      Matrox used to be a leader in the graphics market. I lived in Dorval and I know some people who used to work there. The story is they started to get all uppity about potential new employees' grade point averages. They would only hire people with a GPA above 3.8.


      The thing is, to get these marks you either cheat, or are an idiot savant, or effectively a genius. Now you put all these people in the same room, what do you get? Superior products? No.


      You get ego clashes, clueless idiots, hangers-on and cheaters who couldn't design a 10ms monostable with a 555 and a book from Radio Shack. NO real-world experience, NO real skills whatsoever.
      The Matrox you see today is due to universities run wild and employers being blinded by them.


      Just another example of the irrelevance of university to real-world problems.

    • Au contraire! You forget that those were the days of the API wars, before DirectX on windows and before cards could really do a full OpenGL implementation (hurrah for game engine specific ICDs, heh)

      These days, everyone writes to DirectX, period. Whether they use Direct3D or OpenGL is irrelevant; they all use DirectInput, DirectSound, and so on and so forth.

      This of course means that a graphics card maker need only supply DirectX and OpenGL support and users can run their games on the card. Granted - card speci
  • by DeathPenguin ( 449875 ) * on Sunday November 09, 2003 @02:45PM (#7429581)
    OpenGL performance in Quake 3 [tomshardware.com] and Enemy Territory [tomshardware.com] on these boards roughly matches that of a comparably priced GeForce FX 5600.

    Drivers haven't been tested, but LinuxHardware reports [linuxhardware.org] that Linux drivres will be available in Within the first quarter of next year. Let's just hope it doesn't suck suck and that there are some real perks of running an XGi over a GFFX5600.
    • Er, let's just hope they don't suck as much as my editing/grammar skills.

      Correction: Drivers haven't been tested, but LinuxHardware reports that Linux drivers will be available within the first quarter of next year. Let's just hope they don't suck and that there are some real perks of running an XGi over a GFFX5600.
    • The quake3 page shows something interesting, but look at the scores on the ET page. It comes in dead last by a long shot. Looks roughly to be in GF3 range. That's kinda typical for most minority GPU vendors I've seen over the years. They can usually come close to matching entry level cards and a benchmark or two but fall down miserably on the vast majority of benchmarks. Think Kyro etc.

      Also these "3rd party" GPU vendors tend to not have the resources to implement truly state of the art functions. So while
      • Of course, they are shooting pretty high for their first release. Look at where they claim they'll be positioning their products in the market [tomshardware.com]. They claim their high-end cards will be able to perform on the same level as ATI's (current) highest level card, the Radeon 9800.

        I really have to wonder if they'll be able to properly pull this off. Look at nVidia and ATI's entries into the high-end graphics market; it took them a few product cycles to achieve equality and then a few more to get dominance.

        Of co
        • Of course, if they can even reach parity with the middle high-end chips (which may be low high-end chips by the time XGI finally ships) and price things at or below the competition, we could have something interesting on our hands.

          That, plus open sourced drivers which are managed well enough to gather the talent out there.
          50% of all Linux users shifting their buying decisions to XGI cards for their next buy just because of these drivers could help them a lot at the beginning. Maybe they are lucky, and are at t
  • by Space cowboy ( 13680 ) on Sunday November 09, 2003 @02:46PM (#7429586) Journal
    Interesting that if you look at the graphs, the blue line doesn't appear to do that well (mid-table stuff), but reading the text, it's currently only running at 450MHz, not the specced 500MHz, and with beta-level drivers.

    Multiply its figures by (at least) 500/450 and they look a lot better - normally just (really just, indistinguishably so) behind the leaders... (a quick sketch of that scaling follows below)

    Cheaper, huh? About time too :-)

    Simon
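
    A minimal sketch of that back-of-the-envelope correction, assuming benchmark results scale roughly linearly with core clock (real cards only approximate this, and the benchmark readings below are hypothetical, chosen just for illustration):

      # Scale the review's beta numbers from the tested clock up to the spec clock.
      SPEC_CLOCK_MHZ = 500   # announced clock for the Volari Duo V8 Ultra
      BETA_CLOCK_MHZ = 450   # clock the review sample actually ran at

      def projected_fps(measured_fps: float) -> float:
          """Optimistic linear projection of a measured result at the spec clock."""
          return measured_fps * SPEC_CLOCK_MHZ / BETA_CLOCK_MHZ

      for fps in (60.0, 95.5, 120.0):   # hypothetical benchmark readings
          print(f"{fps:6.1f} fps measured -> {projected_fps(fps):6.1f} fps projected")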
  • Comment removed based on user account deletion
    • Actually there was a Plymouth Volare back in the late 70s. My parents owned one of those. What a major piece of crap.

      I had my first accident in one of those, before I was even old enough to drive. Thought I'd do my parents a favor and back it into the garage for them before a snowstorm. Hit the side of the door with it.
  • by ewhenn ( 647989 ) on Sunday November 09, 2003 @03:00PM (#7429646)
    ... and looked at the results, some were abysmal. Yes, this is still a reference board, but it seems to run a game either great for the price or horribly for the price. Wolf ET is based on the Quake 3 engine, yet Q3 is dramatically higher than Wolf ET (this card came in LAST place in Wolf ET), and these are games based on the same engine. Sorry, I would rather shell out $175 for a card that will consistently perform decently across the whole spectrum than pay $300 for a card that is sometimes great and sometimes sucks.

  • by stubear ( 130454 ) on Sunday November 09, 2003 @03:16PM (#7429704)
    Let me know when the Type-R model is available though.
  • Tom's Hardware (Score:5, Informative)

    by Sivar ( 316343 ) <charlesnburns[@]gmail...com> on Sunday November 09, 2003 @03:18PM (#7429713)
    Ah, Tom's Hardware. Not trying to be negative, but IMHO, they are a terrible source for tech information, and the bulk of their reviews contain startling errors, conclusions that defy reason, glaring omissions, and sensationalized reporting.
    The majority of those writing the reviews clearly have no idea what they are talking about, at least regarding the subject they are reporting on. Overall, I would rate them slightly above HotHardware.com.
    Tom himself, as far as I can tell, is on the ball and knows his stuff VERY well, but he doesn't write articles much anymore, and obviously doesn't read them either.
    It is a common practice among hardware enthusiasts to quote Tom's for the humor value, trying to see if the author of the latest article is even more clueless than he was in his (or her) last article.
    To be fair, they do have some excellent articles occasionally, and were the first ones to dare publish information on Intel's unstable Pentium III 1.13GHz [tomshardware.com] processor, but unfortunately these seem to be the exception rather than the rule.

    Also, as has already been stated, XGI is hardly a new company. Of course, these bits of SiS and Trident are in completely new territory if they are trying to compete in the high-end gamer's market. Considering that this is their first real foray into that market, I think they have done an amazing job. I'd say give them the benefit of the doubt until they prove otherwise. Remember, even the (once) most respected [nvidia.com] companies in the field can falter, and that XGI has something that is even in the same ballpark as the most seasoned of players is an impressive feat.
  • I'll say it....... (Score:3, Informative)

    by Veramocor ( 262800 ) on Sunday November 09, 2003 @03:23PM (#7429728)
    I'll say it because no one else will. Tom is an ass.

    But more importantly, his reviews suck! I haven't even looked at this one, but I venture to guess it's at least 15 pages. Milk that advertising cash cow, Tom!

    If brevity is the soul of wit, then you are one dumb mother fucker Tom.

    Now for the informative part of my rant:

    try www.hardocp.com
    or
    www.anandtech.com
    or
    www.aceshardware.com

    all 100% better than tom's
    • by Hektor_Troy ( 262592 ) on Sunday November 09, 2003 @04:24PM (#7429935)
      So ... you feel that Tom's Hardware is completely useless. Fair enough, everyone is entitled to their opinions.

      You then say that [H]ard|OCP, Anandtech and Ace's Hardware are 100% better than Tom's Hardware. Fair enough, everyone is entitled to their opinions.

      However - 100% of useless is still nothing, so what you're saying is, that [H]ard|OCP, Anandtech and Ace's Hardware are also useless. I don't think that's what you were going for ...
    • Whatever you say may be generally true but Tom is the only one who's gotten his hands on an XGI card.
  • by kenneth_martens ( 320269 ) on Sunday November 09, 2003 @03:33PM (#7429766)
    XGI is a new player in this market and needs something to distinguish itself from the competition. This is an opportunity to persuade them that supporting Linux by releasing drivers would gain them positive reviews and have an impact on sales. Linux is gaining in popularity in the enterprise and server areas, so announcing Linux support for their products would sort of *legitimize* XGI's cards. It's worth a shot--the question is, how do we convince them?
  • The textures are washed out and lack detail. Some rendering is wrong. The "dual chip" version is always displaying a frame behind. But it has a really l33t FPS rate on Quake.
  • by drix ( 4602 ) on Sunday November 09, 2003 @03:55PM (#7429849) Homepage
    The good news: a new, cheaper GFX company bursts onto the scene to challenge ATI and Nvidia's dominance.

    The bad news: the cards suck ass [tomshardware.com].

    So basically, nothing has changed since you woke up this morning.
  • by master_p ( 608214 ) on Sunday November 09, 2003 @03:58PM (#7429859)
    A new graphics company enters the game... yawn. They sure have no aces up their sleeves. They could, though, if they used a parallelizable engine (tile rendering, for instance); then they could just add more processors as new models are requested by the market. They could also make cards upgradable.
    • They sure have no aces up their sleeves.

      While they have none in hardware, they have a strong one outside it: price.

      (Personally I'd like to buy a new, decent gfx card. But all new decent gfx cards cost a fortune I'm just not ready to spend. So...
      • What do you mean by 'decent'? What do you consider an acceptable price range?

        Radeon 9200s are currently in the $50 price range and have 'decent' performance for most things on the market (you may have to cut back resolution and IQ for some cutting edge games). For 3-4x the price ($150-200 range for all but ATI's top cards) you can get some extra performance (less than 2x) and pick up DX9 and better IQ.
  • by scrytch ( 9198 ) <chuck@myrealbox.com> on Sunday November 09, 2003 @04:07PM (#7429887)
    I will never visit or recommend tomshardware again as long as they run ads for the Business Software Alliance.
  • Taiwan of course? (Score:2, Insightful)

    by Moridineas ( 213502 )
    No denying that Taiwan is HUGE, but why is this an "of course"?

    Let's look at the other main graphics card companies--Nvidia, ATI, Matrox.

    Nvidia--California based.

    ATI--Canada based.

    Matrox--Canada based.

    Now, if we were talking motherboard manufacturers, things might be different...
  • AMD has helped prices in the processor market for years, but there has not really been a player willing to wage a price war in the graphics card market. Now maybe those 500 dollar cards will come down a bit.
  • It's simple: make sure that the card is well-supported with decent drivers, and you have a sale!

    Ignore support for Linux... and you're not even on the radar for me.

  • by jazuki ( 70860 ) on Sunday November 09, 2003 @05:09PM (#7430101) Homepage
    THG is a little off on the Latin:

    1. Volatus means flight, not velocity

    2. Volari appears to be closer to volare, meaning "to fly"

    3. Velocity comes from the Latin velocitas (meaning quickness), which is derived from velox (meaning quick)

    While the fact that the two stems (velo- and vol- for velox and volare) share initial consonants suggests that there is a relationship between these stems, this relationship is more likely to have arisen in some proto- or pre-Latin than Latin itself.
  • by Anonymous Coward on Sunday November 09, 2003 @05:11PM (#7430108)
    I am a PC game developer. We recently released a W32 game which uses OpenGL.

    Some hardware companies were especially responsive when we found driver bugs and in general were very helpful and an absolute pleasure to work with. nVidia's developer relations team is head and shoulders above the rest.

    Just about all others were very good. I don't think it would be fair to expect anything more than what all the rest gave. I'd put Matrox, Intel, ATI, 3Dlabs, and S3 in this category.

    Then there was SiS and Trident. We experienced tremendous bugs with the Xabre. Numerous times I offered small code examples to reproduce these bugs. These are very big bugs: crashes if you use a 3D texture, crashes if you use certain keywords in a vertex program, incorrect rendering of primitives, etc. It was around a year before we shipped that I started to contact IHVs to ensure any bugs they had could be worked out. SiS and Trident didn't give a damn.

    It is one thing to develop good hardware. That is 1/3 of the task. Companies have been doing that for years. Remember the G400? Great hardware, awful drivers. The next 1/3 is making everything run correctly. ATI, which is pretty good these days, is still fixing bugs like flipped textures in Flight Sim 2004 (which would obviously affect other games; FS2004 is just the first to expose it) in their latest drivers. How mature can we expect the drivers to be from a new company? Not too good, given that no new company has done it right on the first try, and these two do have a track record of being awful.

    The final 1/3 of the driver is to extract maximum performance. I'm not too worried about this unless it comes before the other, more important parts, as generally this is some form of cheating; or as Trident called it, "application specific acceleration".

    I truly pity somebody who buys one of these cards because it gets the same 3DMark score as a GeForce or Radeon but costs $10 less. There is more to a good card than the scores it gets in the most popular benchmarked games; it is more important that it runs all the games you want. The Xabre can't run FS2004, Splinter Cell, Homeworld 2, etc. At least Intel's integrated graphics, which may not be that quick, do run everything reasonably well.

    Please remember this when you recommend a system for a friend. Please insist they either spend the extra money for an nVidia or ATI product or simply get a card in their price range with less raw 3dmark performance from nVidia or ATI.

    You get what you pay for when it comes to software support!
