New Graphics Company, With Working Cards
gladbach writes "Tom's Hardware has an actual working card in their hands, unlike other vaporware cardmakers *cough* bitboys *cough*... To quote Tom's: 'A new player dares to enter the graphics card market that ATi and Nvidia have dominated for so long. XGI (eXtreme Graphics Innovation), based in Taiwan of course, comes at the market leaders with a line of cards for a whole lot less money. We look at XGI's product range, and offer benchmark results from a beta sample of XGI's top model, the Volari Duo V8 Ultra.'"
interesting (Score:2, Interesting)
Re:Aren't you forgetting someone? (Score:5, Interesting)
I personally find NVidia's TwinView works a lot better, provided you have a card with two outputs, which most of the GF4s and higher seem to have now.
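For anyone setting it up, here's roughly what the TwinView chunk of the Device section in XF86Config-4 looks like. This is a sketch only: the sync ranges and MetaModes below are placeholder values, so substitute your own monitors' specs.

    Section "Device"
        Identifier  "GF4 with TwinView"
        Driver      "nvidia"
        # Enable TwinView and put the second head to the right of the first
        Option      "TwinView" "true"
        Option      "TwinViewOrientation" "RightOf"
        # Sync ranges for the second monitor (placeholders; use your monitor's specs)
        Option      "SecondMonitorHorizSync"   "30-80"
        Option      "SecondMonitorVertRefresh" "60-75"
        # One resolution per head, per metamode
        Option      "MetaModes" "1280x1024,1280x1024; 1024x768,1024x768"
    EndSection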
We might as well ask about S3 [diamondmm.com] if we're asking about Matrox. Remember that great card they had a while back?
The best we can hope for is a price war, I think. Cheaper Nvidia or ATI cards are always better.
Ehh... best of luck to them, but.... (Score:2, Interesting)
ATI and nVidia being the only 2 real competitors in the 3D gaming card market isn't such a horrible thing. (Yes, things would be MUCH worse if we only had ONE major player.) As long as there are two, it keeps innovation and competition alive between them - and it reduces the support headaches for software developers.
It wasn't THAT long ago that you almost had to select your 3D card purchase based on how much you liked the "special edition" titles bundled in the box, because you never knew how many future games would actually be written to support the thing! (I recall buying a Matrox Mystique this way, for example. It came with a Mystique-accelerated version of MechWarrior and a few other decent games, but I barely found anything else that supported it afterwards.)
The old line about "too many cooks spoiling the soup" holds true for too many competing brands in the video card marketplace, too.
Graphs and statistics (Score:3, Interesting)
Multiply its figures by (at least) 500/450 and they look a lot better: normally just (really just, indistinguishably so) behind the leaders...
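Spelling out that correction, as a sketch: the 450 and 500 MHz clocks come from the parent's ratio, but the scores below are made-up placeholders for whatever the review lists, and this assumes fps scales linearly with core clock, which real cards never quite manage.

    # Naive clock-scaling correction for the beta board's scores.
    # Assumes performance scales linearly with core clock (optimistic).
    beta_clock, final_clock = 450, 500  # MHz

    beta_scores = {"Quake3": 180.0, "UT2003": 95.0}  # made-up fps figures
    for test, fps in beta_scores.items():
        estimate = fps * final_clock / beta_clock
        print(f"{test}: {fps:.0f} fps measured -> ~{estimate:.0f} fps at final clock")

That works out to roughly an 11% bump across the board, which is enough to close most of the gaps in the charts.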
Cheaper, huh? About time too.
Simon
Comment removed (Score:3, Interesting)
Re:Variety, Nvidia and 3dfx (Score:3, Interesting)
I dunno. Could be a shrewd move, could be a monumental screwup.
Simon.
If you read the article.... (Score:3, Interesting)
Re:Let there be Linux support (Score:2, Interesting)
Re:XGI isn't new, its trident and Xabre (Score:3, Interesting)
I wish them luck, but I hope that they have something up their sleeves to fix all the deficiencies found in Tom's testing.
Re:XGI = SIS + Trident (Score:3, Interesting)
Re:Going from the beginning to where nVidia failed (Score:1, Interesting)
Re:Going from the beginning to where nVidia failed (Score:4, Interesting)
Well, I disagree. The reason 3dfx was killed off is that the partners decided they wanted to be the only vendor for 3dfx boards, so they killed off all the third-party vendors. Then they bought STB so they could release boards themselves, and that never panned out. So while 3dfx limited its releases, Nvidia expanded with new OEM partners and easily took over the market. Chipmakers need vendors releasing boards, even cheap OEM boards; there needs to be a product to fill, and 3dfx took that away from all the smaller companies. And believe it or not, lots of people buy OEM-quality products. Even CompUSA and the main stores carry an OEM selection with a sticker on the box to make it look retail.
3dfx could have lasted another two years with the multi-GPU design, and they had another GPU in development. The partners tanked the company, sold it for a few million, and walked away. They had better visual quality at the time. OpenGL was a little flaky, but that could have been fixed with drivers; they had a great development team that moved on to other graphics companies.
I had a Voodoo5 and a GF3 Ti500, both the hottest cards out at the time. The Voodoo5, with only two GPUs, kept up in almost every game. But after the drivers stopped coming from 3dfx, you had to move to different hardware.
This is where SiS + Trident = XGI can rock. Trident has been making OEM chipsets for years, and SiS makes motherboard chipsets. With a good design, using multiple GPUs to make boards faster, they can cut a good niche out of the low-end market.
ATI and NVidia use the same GPU cores for most boards; they just cripple the chips that don't pass the high-end tests and use those in cheaper boards. This is why, when the ATI 9500 came out, you could driver-hack it into a 9700. They also use less expensive RAM and limit the hardware. This doesn't actually save that much money, a few bucks, but the selling price can be 50 to 100 dollars more. Most people won't spend 400 bucks on a high-end graphics card, but the 99-dollar sweet spot is a big field, and if XGI can come up with a card to fill that niche, they could come out as the heavy hitter and take over a major piece of the market.
Also, back in the
Of course, after reading the article (a 300-buck release price, and not the top performer), they had better tweak the hell out of those drivers. Maybe multi-core could help, but at the performance level of the 9500/5700 it needs a little more. Driver tweaks could bump it up, since it's still beta. Nvidia and ATI are already tweaking their drivers like crazy, if you watch the benchmarks that come out every month, so tweaking and optimizing drivers does help.
-
Powered by Nvidia, pfft. Spend more time fixing bugs, and less time suckling on the Nvidia tit...
Re:Ehh... best of luck to them, but.... (Score:5, Interesting)
The thing is, to get these marks you either cheat, or are an idiot savant, or effectively a genius. Now put all these people in the same room: what do you get? Superior products? No.
You get ego clashes, clueless idiots, hangers-on and cheaters who couldn't design a 10ms monostable with a 555 and a book from Radio Shack. NO real-world experience, NO real skills whatsoever.
The Matrox you see today is due to universities run wild and employers being blinded by them.
Just another example of the irrelevance of university to real-world problems.
Entice them to support Linux (Score:4, Interesting)
why not parallelization ? (Score:3, Interesting)
Re:Going from the beginning to where nVidia failed (Score:1, Interesting)
Another note on dual-chip configurations: they are not all created equal. ATI's 'Maxx' did in fact cache one entire frame of video for each chip, but chip-level SLI, a la the Voodoo5, would split the workload down to every other pixel. What this means for capacity is that only the framebuffer was copied 'x' number of times. In the case of 1024x768x32 that is about 7 MB wasted, not half the memory capacity!
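The arithmetic behind that, spelled out as a sketch: this assumes only the front and back color buffers get duplicated on the second chip; exactly which surfaces are copied varies by design.

    # Memory cost of duplicating the framebuffer on a second chip,
    # per the parent's 1024x768x32 example. Assumes front + back color
    # buffers are the duplicated surfaces; other per-chip state is ignored.
    width, height, bytes_per_pixel = 1024, 768, 4  # 32 bpp

    one_buffer = width * height * bytes_per_pixel  # ~3 MB
    extra_copy = 2 * one_buffer                    # front + back duplicated once
    print(f"one buffer:     {one_buffer / 2**20:.1f} MB")
    print(f"extra per chip: {extra_copy / 2**20:.1f} MB")

That lands around 6 MB, in the same ballpark as the parent's 7 MB figure, and either way nowhere near half of the card's total memory.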
Questionable if they match a 5600 (Score:3, Interesting)
Also, these "3rd party" GPU vendors tend not to have the resources to implement truly state-of-the-art functions. So while there may be some PR about advanced Pixel Shader support, I'd be surprised if they can ever pull it off. I know I mentioned the Kyro already, but I can just see the same thing happening. Poor performance was blamed on drivers, but in the end we learned that the card just wasn't that good, and no amount of driver tweaking could make up for its architectural flaws.
I do welcome another player, but I'm sure not holding my breath that they will be able to compete on a technical level with the big boys. In most cases, consumers will still be better off buying the previous generation of GPUs from established vendors who have worked all the kinks out and have proven that they will be around long term.