New Graphics Company, With Working Cards

gladbach writes "Tom's Hardware has in their hands an actual working card, unlike other vaporware card makers *cough* bitboys *cough*... To quote Tom's: 'A new player dares to enter the graphics card market that ATi and Nvidia have dominated for so long. XGI (eXtreme Graphics Innovation), based in Taiwan of course, comes at the market leaders with a line of cards for a whole lot less money. We look at XGI's product range, and offer benchmark results from a beta sample of XGI's top model, the Volari Duo V8 Ultra.'"
  • interesting (Score:2, Interesting)

    by Anonymous Coward on Sunday November 09, 2003 @02:13PM (#7429413)
    That's pretty nice; hope they bring down prices. Looks like their best will be priced at around $300. If it isn't a total flop, ATI and nvidia might face some competition. I'd sure like to get a high-performance card for $300 or $350, and a nice midrange for around $100... we'll see. Maybe a midrange card on sale might even go for $70. I've seen some 5600s at $90; it could go down more.
  • by Ark42 ( 522144 ) <slashdotNO@SPAMmorpheussoftware.net> on Sunday November 09, 2003 @02:25PM (#7429502) Homepage


    I personally find NVidia's TwinView to work a lot better, if you have a card with two outputs, which most of the GF4s and higher seem to have now.

    We might as well ask about S3 [diamondmm.com] if we're asking about Matrox. Remember that great card they had a while back?

    The best we can hope for is a price war, I think. Cheaper Nvidia or ATI cards are always better.

  • by King_TJ ( 85913 ) on Sunday November 09, 2003 @02:29PM (#7429520) Journal
    IMHO, don't expect a miracle. Don't forget, long-standing, legitimate graphics card makers (such as Matrox, or, years ago, Hercules) all tried to get into the high-end 3D accelerator market and failed miserably. Even lately, you hear some rumbling coming from Matrox every once in a while about some new "killer 3D card", and it always turns out to amount to nothing.

    ATI and nVidia being the only two real competitors in the 3D gaming card market isn't such a horrible thing. (Yes, things would be MUCH worse if we only had ONE major player.) As long as there are two, it keeps innovation and competition alive between them - and it reduces the support headaches for software developers.

    It wasn't THAT long ago that you almost had to select your 3D card purchase based on how much you liked the "special edition" titles bundled in the box - because you never knew how many future games would actually be written to support the thing! (I recall buying a Matrox Mystique like this, for example. It came with a Mystique-accelerated version of MechWarrior and a few other decent games - but I barely found anything else that supported it afterwards.)

    The old line about "too many cooks spoiling the soup" holds true for too many competing brands in the video card marketplace, too.
  • by Space cowboy ( 13680 ) on Sunday November 09, 2003 @02:46PM (#7429586) Journal
    Interesting: if you look at the graphs, the blue line doesn't appear to do that well (mid-table stuff), but reading the text, it's currently only running at 450MHz, not the specced 500MHz, and with beta-level drivers.

    Multiply its figures by (at least) 500/450 and they look a lot better - normally just barely (indistinguishably so) behind the leaders. A rough sketch of that adjustment follows below.

    Cheaper, huh? About time too :-)

    Simon
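
    The clock-ratio adjustment above is easy to make concrete. Here is a minimal sketch, assuming benchmark scores scale linearly with core clock (an optimistic upper bound; memory bandwidth and drivers also matter). The scores below are made-up placeholders, not figures from the article:

        # Project a 450 MHz beta card's benchmark scores to the specced
        # 500 MHz, assuming perfectly linear scaling with core clock.
        SPEC_CLOCK_MHZ = 500
        BETA_CLOCK_MHZ = 450

        def project_score(measured_fps):
            """Upper-bound estimate: scale a score by the clock ratio."""
            return measured_fps * SPEC_CLOCK_MHZ / BETA_CLOCK_MHZ

        # Hypothetical beta-driver scores (placeholders, not the article's numbers)
        beta_scores = {"Quake 3": 180.0, "UT2003": 95.0, "Wolf ET": 60.0}
        for game, fps in beta_scores.items():
            print(f"{game}: {fps:.0f} fps measured -> ~{project_score(fps):.0f} fps at spec clock")

    The ratio works out to about 1.11, so an 11% bump at best: enough to close a small gap to the leaders, not a large one.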
  • Comment removed (Score:3, Interesting)

    by account_deleted ( 4530225 ) on Sunday November 09, 2003 @02:50PM (#7429603)
    Comment removed based on user account deletion
  • by Space cowboy ( 13680 ) on Sunday November 09, 2003 @02:54PM (#7429622) Journal
    I agree with most of that. I'm not sure the memory thing is an issue at the moment though - memory is dirt cheap. Huge chips are more than 2x the cost of smaller ones, for only 1.5x the transistor count...

    I dunno. Could be a shrewd move, could be a monumental screwup :-)

    Simon.
  • by ewhenn ( 647989 ) on Sunday November 09, 2003 @03:00PM (#7429646)
    ... and looked at the results; some were abysmal. Yes, this is still a reference board, but it seems to run a game either great for the price or horribly for the price. Wolf ET is based on the Quake3 engine, yet Q3 scores dramatically higher than Wolf ET (this card came in LAST place in Wolf ET), and these are games based on the same engine. Sorry, I would rather shell out $175 for a card that will consistently perform decently across the whole spectrum than pay $300 for a card that is sometimes great and sometimes sucks.

  • by reaperbean ( 453437 ) on Sunday November 09, 2003 @03:00PM (#7429649) Homepage
    Well, according to Linux Hardware [linuxhardware.org], there will be XGI-provided Linux drivers in a few months!
  • by Jeff DeMaagd ( 2015 ) on Sunday November 09, 2003 @03:14PM (#7429696) Homepage Journal
    The rank-and-file may be nice people, but I don't remember any Trident or SiS graphics chip living up to the hype, so they end up getting relegated to the budget market.

    I wish them luck, but I hope that they have something up their sleeves to fix all the deficiencies found in Tom's testing.
  • by flewp ( 458359 ) on Sunday November 09, 2003 @03:25PM (#7429735)
    I agree. I've used ATI cards for about 5 to 6 years now. I've had 3 cards, and they've all run great, never had a problem. Some people just get an idea in their head and they refuse to believe otherwise.
  • by Anonymous Coward on Sunday November 09, 2003 @03:25PM (#7429737)
    About those extra caps... it's very simple. Modern high-speed logic behaves more like RF circuitry than just 1s and 0s. In the fast-turnaround, low-profit-margin world of these cards, it's easier to add lots of extra capacitors ahead of time, then remove them one by one while testing for EMI/RFI/noise problems. That keeps the cost down, and you don't need to buy $50,000 RF simulation software and a PhD to run it. Another example of how real-world considerations call the value of a university education into question.
  • by BrookHarty ( 9119 ) on Sunday November 09, 2003 @03:32PM (#7429758) Journal
    Remember, going for quantity rather than quality was what killed 3dfx. How quickly some people forget :)

    Well, I disagree. The reason 3DFX was killed off is that the partners decided they wanted to be the only vendor for 3dfx boards, so they killed off all third-party vendors. Then they bought STB so they could release boards themselves, and that never panned out. So while 3DFX limited its releases, nvidia expanded with new OEM partners and easily took over the market. Vendors need boards to release, even cheap OEM boards; they need a product to fill, and 3DFX took that away from all the smaller companies. And believe it or not, lots of people buy OEM-quality products. Even CompUSA and the main stores carry an OEM selection with a sticker on the box to make it look retail.

    3DFX could have lasted another 2 years with the multi-GPU design, and had another GPU in development. The partners tanked the company, sold it for a few million, and walked away. They had better visual quality at the time. OpenGL was a little flaky, but that could have been fixed with drivers; they had a great development team that moved on to other GFX companies.

    I had a Voodoo5 and a GF3 Ti500, both the hottest cards out at the time. The Voodoo5, with only 2 GPUs, kept up in almost every game. But after the drivers stopped coming from 3DFX, you had to move to different hardware.

    This is where SiS+Trident=XGI can rock. Trident has been making OEM chipsets for years, and SiS makes motherboard chipsets. With a good design, using multi-GPU boards to get more speed, they can carve a good niche out of the low-end market.

    ATI and NVidia use the same GPU cores for most boards; they just cripple and reuse the GPUs that don't pass high-end tests. This is why, when the ATI 9500 came out, you could driver-hack it into a 9700. They also use less expensive RAM and limit the hardware. This doesn't actually save that much money, a few bucks, but the selling price can be $50-100 more. Most people won't spend $400 on a high-end GFX card, but the $99 sweet spot is a big field, and if XGI can come up with a card to fill that niche, they could come out as a heavy hitter and take over a major piece of the market.

    Also, back in the dot-com days, SiS and Trident made most of the low-end SVGA cards that powered almost every server I saw, until Intel started putting graphics chips on motherboards. They have the background, and with the manpower out in the market right now, they could easily take 10-15% of the market on initial release.

    Of course, after reading the article: at a $300 release price, and not the top performer, they had better tweak the hell out of those drivers. Maybe multi-core could help, but at the performance level of the 9500/5700, it needs a little more. Driver tweaks could bump it up, since it's still beta. Nvidia and ATI are already tweaking their drivers like crazy, if you watch the benchmarks that come out every month, so tweaking and optimizing drivers does help.

    -
    "Powered by Nvidia", pfft. Spend more time fixing bugs, and less time suckling on the nvidia tit...
  • by Anonymous Coward on Sunday November 09, 2003 @03:32PM (#7429760)
    Matrox used to be a leader in the graphics market. I lived in Dorval, and I know some people who used to work there. The story is they started to get all uppity about potential new employees' grade point averages. They would only hire people with a GPA above 3.8.

    The thing is, to get those marks you either cheat, are an idiot savant, or are effectively a genius. Now you put all these people in the same room; what do you get? Superior products? No.

    You get ego clashes, clueless idiots, hangers-on, and cheaters who couldn't design a 10ms monostable with a 555 and a book from Radio Shack. NO real-world experience, NO real skills whatsoever. The Matrox you see today is due to universities run wild and employers being blinded by them.

    Just another example of the irrelevance of university to real-world problems.

  • by kenneth_martens ( 320269 ) on Sunday November 09, 2003 @03:33PM (#7429766)
    XGI is a new player in this market and needs something to distinguish itself from the competition. This is an opportunity to persuade them that supporting Linux by releasing drivers would gain them positive reviews and have an impact on sales. Linux is gaining in popularity in the enterprise and server areas, so announcing Linux support for their products would sort of *legitimize* XGI's cards. It's worth a shot; the question is, how do we convince them?
  • by master_p ( 608214 ) on Sunday November 09, 2003 @03:58PM (#7429859)
    A new graphics company enters the game... yawn. They sure have no aces up their sleeves. They could have, though, if they had used a parallelizable engine (tile-based rendering, for instance); then they could just add more processors as new models are requested by the market. They could also make cards upgradable.
  • by Anonymous Coward on Sunday November 09, 2003 @04:29PM (#7429962)
    After an extended conversation with a former 3dfx employee, I learned that every chip they made for the enthusiast market was a placeholder for the next-gen processor. That means the Voodoo2, Voodoo3, and Voodoo5 were all an effort to keep the performance lead while trying to implement a new architecture, which never saw the light of day. Their lack of execution appears to be what destroyed the company more than any other factor.

    Another note on dual-chip configurations: they are not all created equal. ATI's 'MAXX' did in fact cache one entire frame of video for each chip, but chip-level SLI, a la Voodoo5, would separate the workload down to every other pixel. What this means for capacity is that only the framebuffer was copied 'x' number of times. In the case of 1024x768x32, that is about 7 megs wasted, not half the memory capacity! (See the quick check below.)
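
    The framebuffer arithmetic above is easy to check. A minimal back-of-the-envelope sketch, assuming only the 32-bit color buffers (front and back) are duplicated per chip; whether the Z-buffer is also duplicated depends on the architecture, so the exact "wasted" figure varies:

        # Back-of-the-envelope framebuffer math for a dual-chip SLI card.
        WIDTH, HEIGHT = 1024, 768   # the "10x7" resolution in the parent comment
        BYTES_PER_PIXEL = 4         # 32-bit color

        color_buffer = WIDTH * HEIGHT * BYTES_PER_PIXEL   # one color buffer
        front_and_back = 2 * color_buffer                 # double buffering

        print(f"one color buffer: {color_buffer / 2**20:.1f} MB")   # ~3.0 MB
        print(f"front + back: {front_and_back / 2**20:.1f} MB duplicated per extra chip")   # ~6.0 MB

    Roughly 6 MB duplicated per extra chip, in the same ballpark as the parent's "about 7 megs" once any extra duplicated buffers are counted, and nowhere near half of a 128 MB card's capacity.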
  • by bogie ( 31020 ) on Sunday November 09, 2003 @04:38PM (#7429983) Journal
    The quake3 page shows something interesting, but look at the scores on the ET page: it comes in dead last by a long shot, roughly in GF3 range. That's kinda typical of most minority GPU vendors I've seen over the years. They can usually come close to matching entry-level cards in a benchmark or two, but they fall down miserably on the vast majority of benchmarks. Think Kyro, etc.

    Also, these "3rd party" GPU vendors tend not to have the resources to implement truly state-of-the-art functions. So while there may be some PR about advanced pixel shader support, I'd be surprised if they can ever pull it off. I know I mentioned Kyro already, but I can just see the same thing happening: poor performance was blamed on drivers, but in the end we learned that the card just wasn't that good, and no amount of driver tweaking could make up for its architectural flaws.

    I do welcome another player, but I'm sure not holding my breath that they will be able to compete on a technical level with the big boys. In most cases, consumers will still be better off buying the previous generation of GPUs from established vendors who have worked out all their kinks and proven that they will be around long term.
