New Graphics Company, With Working Cards 461
gladbach writes "Tom's Hardware has in their hands an actual working card, unlike other vaporware cardmakers *cough* Bitboys *cough*... To quote Tom's: 'A new player dares to enter the graphics card market that ATi and Nvidia have dominated for so long. XGI (eXtreme Graphics Innovation), based in Taiwan of course, comes at the market leaders with a line of cards for a whole lot less money. We look at XGI's product range, and offer results from a beta board of XGI's top model, the Volari Duo V8 Ultra.'"
Let there be Linux support (Score:3, Insightful)
Re:Let there be Linux support (Score:2, Interesting)
Re:Let there be Linux support (Score:2)
And that's assuming they provide a compilable wrapper. If they don't, I can only use the card with the kernel versions they bother supporting, leaving me completely at their mercy for continued use of my card...
I'd be an idiot to buy a card that becomes useless the second the company goes bankrupt / drops support to get me to
Re:Let there be Linux support (Score:4, Funny)
You don't say. Would you care to point out either a company that has been reduced to nothing (besides Loki -- ha!) or skyrocketed to success solely on the awesome strength of Linux users?
Re:Let there be Linux support (Score:5, Funny)
Re:Let there be Linux support (Score:3, Insightful)
Re:Let there be Linux support (Score:3, Funny)
Re:Let there be Linux support (Score:3, Insightful)
interesting (Score:2, Interesting)
Aren't you forgetting someone? (Score:5, Informative)
What about Matrox [matrox.com], who've been dominating the multi-monitor graphics card market for years?
Re:Aren't you forgetting someone? (Score:2, Informative)
They dominate the scientific market too... especially medical imaging...
The Parhelia is a hit amongst pro gfx people...
But none of them will let you play Quake III at 500 fps
Re:Aren't you forgetting someone? (Score:2, Informative)
Seriously: it sounds like Matrox is going down the tubes. They recently downgraded their card from DX9 to DX8 and have been having major problems getting new driver releases out and dealing with game compatibility. There might be a handful of 2D applications that need Matrox cards, but it's not a good general purpose solution nor anything you should even try for gaming.
Everything BUT gaming.... (Score:2)
For everything else that isn't 3D gaming, I've found Matrox cards to do plenty well and they did everything I wanted, plus a few things thei
Re:Aren't you forgetting someone? (Score:5, Interesting)
I personally find NVidia's TwinView to work a lot better if you have a card with two outputs, which most GF4s and higher seem to have now.
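(For reference, TwinView under NVIDIA's Linux driver is just a handful of X config options. Below is a minimal illustrative sketch, not a tested setup: the option names are the ones documented in NVIDIA's driver README, while the resolutions and the second monitor's timing ranges are placeholders you'd swap for your own.)

    Section "Device"
        Identifier  "nvidia0"
        Driver      "nvidia"
        # Enable TwinView and describe the second monitor (placeholder timing ranges)
        Option      "TwinView"                 "true"
        Option      "SecondMonitorHorizSync"   "30-81"
        Option      "SecondMonitorVertRefresh" "56-75"
        # One entry per metamode: "first head, second head"
        Option      "MetaModes"                "1280x1024,1280x1024; 1024x768,1024x768"
        # Other orientations: LeftOf, Above, Below, Clone
        Option      "TwinViewOrientation"      "RightOf"
    EndSection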
We might as well ask about S3 [diamondmm.com] if we're asking about Matrox. Remember that great card they had a while back?
The best we can hope for is a price war, I think. Cheaper Nvidia or ATI cards are always better.
Re:Aren't you forgetting someone? (Score:2)
For most features on dual head, ATI, nVidia and Matrox are pretty close, although there are two or three features that some people find ridiculously handy.
The one thing Matrox can do that I haven't found in ATI or nVidia is having two identical signal outputs (dual-head clone). That way I can have the exact same output on both my video projector and a 21" CRT.
Re:Aren't you forgetting someone? (Score:5, Informative)
Re:Aren't you forgetting someone? (Score:2, Insightful)
ATI and nVidia (traditionally) had horrible video quality (blurry edges, ghosted edges due to ringing, etc.).
A while back, I helped a friend of mine upgrade his video card to gain dual display. The first thing we tried (his first choice) was some type of GeForce4 card - it worked, and ran games plenty fast, but the image quality was not really that great, particularly for high-resolution desktops (dual 19" Sony CRTs), and the coo
Re:Aren't you forgetting someone? (Score:2)
Quad DVI interface to drive LCDs at 3840 x 2400 pixels... =)
Now we're talking space!
OK, maybe not something for the gamers out there. I don't think the Parhelia, or any other card, has the power to run any kind of game at reasonable speeds at those resolutions.
But for any kind of serious work: WOW!
I'd love one of these! =)
Re:Aren't you forgetting someone? (Score:2)
At least for people who care about performance, anyway.
Re:Aren't you forgetting someone? (Score:2, Informative)
Re:Aren't you forgetting someone? (Score:5, Informative)
True. IMO Matrox is best for non-gaming applications. One thing a lot of people forget is that Matrox is a significant player in digital video products for video capture and real-time editing. Their RT.X [matrox.com] series of DV editing products is among the best in its price range.
Re:Aren't you forgetting someone? (Score:2)
Don't even get me started!
* If you ask politely, they'll e-mail you a link to an accelerated dri
Re: (Score:3, Interesting)
Re:Aren't you forgetting someone? (Score:2)
Re:Aren't you forgetting someone? (Score:2)
It's not just Matrox and nVidia tho, I just ordered 4 of the CR-
Re:Actually, no. (Score:3)
It's all to do with finding the right tool for the job. I've yet to see a decent web browser / email client / nntp client / terminal that requires fast 3D graphics. YMMV.
Re:Actually, no. (Score:3, Funny)
Tom's Hardware (Score:5, Funny)
Re:Tom's Hardware (Score:5, Funny)
Games as entertainment brings teenagers. (Score:2, Offtopic)
The problem is that the emphasis on games as entertainment, as opposed to games as technology, brings lots of teenagers to Slashdot. It's their chance to hassle adults without paying a price.
Some of those teenagers get moderation points and they moderate comments they don't understand, and flood Slashdot with garbage.
Re:Games as entertainment brings teenagers. (Score:2)
So would all the other teenagers out there. Trust me, when you reach 30, you'll look back and think "God, I was an idiot!"
It's OK, though, from what I hear when you reach 50 you look back at your 30s and say "God, I was an idiot!"
Variety, Nvidia and 3dfx (Score:5, Insightful)
Competition is the mother of invention, when necessity can't be.
Seems that XGI is going after some odd designs, using the fabled 3dfx dual-chip approach as a way to get more bang for the buck. It's not a clean solution, though; as Tom's Hardware reveals, it brings its own problems. The big one? With two chips, each needs its own copy of the data, so you effectively get only half the card's nominal memory. *ouch*!
Still, this is the first line for XGI. I'm sure we'll see a lot more from them, if they don't go broke.
Re:Variety, Nvidia and 3dfx (Score:2)
Re:Variety, Nvidia and 3dfx (Score:3, Interesting)
I dunno. Could be a shrewd move, could be a monumental screwup
Simon.
Drivers (Score:5, Insightful)
Relief thy name is XGI (Score:5, Insightful)
And that is all I really wanted to hear. Thank goodness they won't be a fly-by-night company. This is a very welcome addition to the market. Let's also hope that they either make Linux drivers or open the architecture so developers can do so.
The great thing about new companies with financial legs to stand on is that they can learn from the mistakes of others without having to make those mistakes themselves, as well as from the things other companies did right.
I for one welcome our future (their mission statement is to be #1 by 2007) GPU overlords!
Re:Relief thy name is XGI (Score:2)
Going from the beginning to where nVidia failed (Score:5, Insightful)
The second problem is that, due to the size of the card, it's not gonna fit in smaller form-factor PCs. Why they put such a huge heatsink on the back of the card, where there's usually not much space, instead of just putting more cooling on the front, where high-end users (of nVidia cards, anyway) are already accustomed to leaving a PCI slot open to make room, is beyond me.
Those two big fans they've stuck on their reference board sure aren't going to help keep noise levels down, either. My (reference) Radeon 9800 Pro still beats the crap out of most cards on the market today, and it's only got a small HSF for the chipset and nothing on the memory chips. And I was still able to OC it quite a bit. If nVidia's and XGI's chips really require as much cooling as manufacturers stick on them, even on "reference" boards, they must be very inefficient chips indeed. These things aside, it's always nice to see more competition in the graphics chipset business; hopefully prices will come down a bit as a result if ATi and/or nVidia see XGI as a real contender, rather than a wannabe like Matrox (though I don't know if they're even at the "wannabe" level any longer, considering how poor their chips are nowadays).
Re:Going from the beginning to where nVidia failed (Score:2)
Matrox & other professional cards simply look better for most things, and Nvidia doesn't support some things, like sub-pixel antialiasing.
An example of this: http://www.3dlabs.com/product/technology/Wildcat_Quality_ro.doc (It's a
Re:Going from the beginning to where nVidia failed (Score:4, Interesting)
Well, I disagree. The reason 3DFX was killed off: the partners decided they wanted to be the only vendor of 3dfx boards, so they cut off all the third-party vendors. Then they bought STB so they could release boards themselves, and that never panned out. So with 3DFX limiting its releases, nvidia expanded with new OEM partners and easily took over the market. Vendors need boards to release, even cheap OEM boards; they need a product to fill. 3DFX took that away from all the smaller companies. And believe it or not, lots of people buy OEM-quality products. Even CompUSA and mainstream stores carry an OEM selection with a sticker on the box to make it look retail.
3DFX could have lasted another 2 years with the multi-GPU design, and they had another GPU in development. The partners tanked the company, sold it for a few million, and walked away. They had better visual quality at the time. OpenGL was a little flaky, but that could have been fixed with drivers; they had a great development team that moved on to other GFX companies.
I had a Voodoo5 and a GF3 Ti500, both the hottest cards out at the time. The Voodoo5, with only 2 GPUs, kept up in almost every game. But after the drivers stopped coming from 3DFX, you had to move to different hardware.
This is where SiS + Trident = XGI can rock. Trident has been making OEM chipsets for years, and SiS makes motherboard chipsets. With a good design, using multi-GPU setups to make boards faster, they can carve a good niche out of the low-end market.
ATI and NVidia use the same GPU cores for most boards; they just cripple the chips that don't pass the high-end tests and use them in cheaper cards. This is why, when the ATI 9500 came out, you could driver-hack it into a 9700. They also use less expensive RAM and limit the hardware. This doesn't actually save that much money, a few bucks, but the selling price can be 50-100 dollars more. Most people won't spend 400 bucks on a high-end GFX card, but the 99-dollar sweet spot is a big field, and if XGI can come up with a card to fill that niche, they could come out as a heavy hitter and take over a major piece of the market.
Also, back in the
Of course, after reading the article (a 300-buck release price, and not the top performer), they had better tweak the hell out of those drivers. Maybe multi-core could help, but at the performance level of a 9500/5700 it needs a little more. Maybe driver tweaks, since it's still a beta, could bump it up. Nvidia and ATI are already tweaking their drivers like crazy, if you watch the benchmarks that come out every month, so tweaking and optimizing drivers does help.
-
"Powered by Nvidia," pfft. Spend more time fixing bugs, and less time suckling on the Nvidia tit...
XGI isn't new, its trident and Xabre (Score:5, Informative)
From webpage:
'Founded in May of 2003 and headquartered in Taipei, Taiwan, XGI pulls from a deep reservoir of engineering and design talent stemming from its acquisitions of Trident Microsystems, and Silicon Integrated Systems' graphics divisions.'
Although it says Taipei, most of the hard-core engineers are in San Jose, in the old Trident building, right across from Fry's Electronics. Go say hi; they're a nice bunch.
Re:XGI isn't new, its trident and Xabre (Score:3, Interesting)
I wish them luck, but I hope that they have something up their sleeves to fix all the deficiencies found in Tom's testing.
Re:XGI isn't new, its trident and Xabre (Score:2)
Whoa!! (Score:5, Funny)
WOW!!
but - wait a minute - guess I'll wait for the Volari Quattro Triplex V12 Turbo GTI XXL
THAT'll be a nice graphics card!
Re:Whoa!! (Score:2)
Namely, quality and efficiency.
Then again, I'm still using a Radeon 7000 PCI, and a Voodoo3 in my two computers. So.. I can't bitch much.
Re:Whoa!! (Score:2)
See [sgi.com]. They're not cheap, though.
Re:Whoa!! (Score:2)
Re:Whoa!! (Score:2)
"But.. but.. it will make my car^H^H^Hcomputer go faster!"
Arh! (Score:2)
Why oh why do companies insist on using SiS's chips for TV out? It's horrible!! VIA used them on the Epia boards, and it stinks! Great little boards, don't get me wrong, but the TV out is nasty. Now these guys. Will the pain never end??!
[/sis rant]
But good luck to them
Re:Arh! (Score:2)
Hello from Earth.
Watching a DVD through SiS's TV out chip is more painful than watching it through mplayer's libaa output.
I used to own a Savage 3D
Ehh... best of luck to them, but.... (Score:2, Interesting)
ATI and nVidia being the only 2 real competitors in the 3D gaming card market isn't such a horrible thing. (Yes, things woul
Re:Ehh... best of luck to them, but.... (Score:5, Interesting)
The thing is, to get these marks you either cheat, or are an idiot savant, or effectively a genius. Now you put all these people in the same room, what do you get? Superior products? No.
You get ego clashes, clueless idiots, hangers-on and cheaters who couldn't design a 10ms monostable with a 555 and a book from Radio Shack. NO real-world experience, NO real skills whatsoever.
The Matrox you see today is due to universities run wild and employers being blinded by them.
Just another example of the irrelevance of university to real-world problems.
Re:Ehh... best of luck to them, but.... (Score:2)
These days, everyone writes to DirectX, period. Whether they use Direct3D or OpenGL is irrelevant; they all use DirectInput, DirectSound, and so on and so forth.
This of course means that a graphics card maker need only supply DirectX and OpenGL support and users can run their games on the card. Granted - card speci
A quick summary of the article for Linux users (Score:5, Informative)
Drivers haven't been tested, but LinuxHardware reports [linuxhardware.org] that Linux drivres will be available in Within the first quarter of next year. Let's just hope it doesn't suck suck and that there are some real perks of running an XGi over a GFFX5600.
Re:A quick summary of the article for Linux users (Score:2)
Correction: Drivers haven't been tested, but LinuxHardware reports that Linux drivers will be available within the first quarter of next year. Let's just hope they don't suck and that there are some real perks to running an XGI over a GFFX5600.
Questionable if they match a 5600 (Score:3, Interesting)
Also these "3rd party" GPU vendors tend to not have the resources to implement truly state of the art functions. So while
Re:Questionable if they match a 5600 (Score:2)
I really have to wonder if they'll be able to properly pull this off. Look at nVidia and ATI's entries into the high-end graphics market; it took them a few product cycles to achieve equality and then a few more to get dominance.
Of co
Re:Questionable if they match a 5600 (Score:2)
This, plus open-source drivers managed well enough to gather the talent that's out there.
50% of all Linux users shifting their next purchase to XGI cards just because of these drivers could help them a lot at the beginning. Maybe they are lucky, and are at t
Graphs and statistics (Score:3, Interesting)
Multiply its figures by (at least) 500/450 - a 45 fps score, say, scales to 50 fps - and they look a lot better: normally just (really just, indistinguishably so) behind the leaders...
Cheaper, huh? About time too
Simon
Re: (Score:2)
Re:It'll take someone very old to remember this... (Score:2)
I had my first accident in one of those, before I was even old enough to drive. Thought I'd do my parents a favor and back it into the garage for them before a snowstorm. Hit the side of the door with it.
If you read the article.... (Score:3, Interesting)
Re:If you read the article.... (Score:2)
Where the Radeon 9800 Pro's and XT's score 55-60 frames a second, the Duo V8 comes in at 16 fps. Not so good.
"With the current driver, the game is practically unplayable..." Ouch.
Re:If you read the article.... (Score:2)
Re:If you read the article.... (Score:3, Funny)
No thanks... (Score:4, Funny)
Tom's Hardware (Score:5, Informative)
The majority of those writing the reviews clearly have no idea what they are talking about, at least regarding the subject they are reporting on. Overall, I would rate them slightly above HotHardware.com.
Tom himself, as far as I can tell, is on the ball and knows his stuff VERY well, but he doesn't write articles much anymore, and obviously doesn't read them either.
It is a common practice among hardware enthusiasts to quote Tom's for the humor value, trying to see if the author of the latest article is even more clueless than he was in his (or her) last article.
To be fair, they do have some excellent articles occasionally, and were the first ones to dare publish information on Intel's unstable Pentium III 1.13GHz [tomshardware.com] processor, but unfortunately these seem to be the exception rather than the rule.
Also, as has already been stated, XGI is hardly a new company. Of course, these bits of SiS and Trident are in completely new territory if they are trying to compete in the high-end gamer's market. Considering that this is their first real foray into that market, I think they have done an amazing job. I'd say give them the benefit of the doubt until they prove otherwise. Remember, even the (once) most respected [nvidia.com] companies in the field can falter, and that XGI has something that is even in the same ballpark as the most seasoned of players is an impressive feat.
I'll say it....... (Score:3, Informative)
But more importantly, his reviews suck! I haven't even looked at this one, but I venture to guess it's at least 15 pages. Milk that advertising cash cow, Tom!
If brevity is the soul of wit, then you are one dumb motherfucker, Tom.
Now for the informative part of my rant:
try www.hardocp.com
or
www.anandtech.com
or
www.a
all 100% better than Tom's
Re:I'll say it....... (Score:4, Funny)
You then say that [H]ard|OCP, Anandtech and Ace's Hardware are 100% better than Tom's Hardware. Fair enough, everyone is entitled to their opinions.
However, 100% better than useless is still useless, so what you're saying is that [H]ard|OCP, Anandtech and Ace's Hardware are also useless. I don't think that's what you were going for.
Re:I'll say it....... (Score:2)
Entice them to support Linux (Score:4, Interesting)
Review summary (Score:2)
Re:This was explained (Score:2)
Here, I'll save you sum time (Score:3, Funny)
The bad news: the cards suck ass [tomshardware.com].
So basically, nothing has changed since you woke up this morning.
why not parallelization ? (Score:3, Interesting)
Re:why not parallelization ? (Score:2)
While in hardware they have no advantage, they have a strong one outside of that: price.
(personally I'd like to buy a new, decent gfx card. But all new decent gfx cards cost a fortune that I'm just not ready to spend. So...
Re:why not parallelization ? (Score:2)
Radeon 9200s are currently in the $50 price range and have 'decent' performance for most things on the market (you may have to cut back resolution and IQ for some cutting-edge games). For 3-4x the price (the $150-200 range for all but ATI's top cards) you can get some extra performance (less than 2x) and pick up DX9 and better IQ.
Toms hardware is running BSA ads (Score:5, Informative)
Re:Big deal (Score:3, Informative)
Feeding the trolls: Microsoft sells a product. It may be a crappy product that OEM's are strongarmed into reselling, but they're still a business. The BSA exists to shake companies down with license audits, and has jackbooted thugs literally break down doors to ensure compliance.
Slashdot is "anti-microsoft" insofar as there's a large and very vocal segment of its population consisting of semi-literate fanboys
Taiwan of course? (Score:2, Insightful)
Let's look at the other main graphics card companies--Nvidia, ATI, Matrox.
Nvidia--California based.
ATI--Canada based.
Matrox--Canada based.
Now, if we were talking motherboard manufacturers, things might be different...
Time for a low cost alternative (Score:2)
Simple Sales Guide (Score:2)
Ignore support for Linux... and you're not even on the radar for me.
The Latin is a little off (Score:3, Informative)
1. Volatus means flight, not velocity
2. Volari appears to be closer to volare, meaning "to fly"
3. Velocity comes from the Latin velocitas (meaning quickness), which is derived from velox (meaning quick)
While the fact that the two stems (velo- and vol- for velox and volare) share initial consonants suggests that there is a relationship between these stems, this relationship is more likely to have arisen in some proto- or pre-Latin than Latin itself.
Bad news for the game industry (Score:4, Informative)
Some hardware companies were especially responsive when we found driver bugs and in general were very helpful and an absolute pleasure to work with. nVidia's developer relations team is head and shoulders above the rest.
Just about all others were very good. I don't think it would be fair to expect anything more than what all the rest gave. I'd put Matrox, Intel, ATI, 3Dlabs, and S3 in this category.
Then there was SiS and Trident. We experienced tremendous bugs with the Xabre. Numerous times I offered small code examples to reproduce these bugs. These are very big bugs: crashes if you use a 3D texture, crashes if you use certain keywords in a vertex program, incorrect rendering of primitives, etc. It was around a year before we shipped that I started contacting IHVs to make sure any bugs they had could be worked out. SiS and Trident didn't give a damn.
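Just to give a flavor of what's being described: a "small code example" for the 3D-texture crash could be as simple as the sketch below. This is my own illustrative guess, not the poster's actual repro code; it brings up a GL context with GLUT and uploads a tiny 3D texture, which is the operation said to crash the Xabre drivers.

    /* Minimal 3D-texture repro sketch (illustrative only).
     * glTexImage3D is core since OpenGL 1.2; on platforms whose headers stop
     * at 1.1 you would have to fetch it through the extension mechanism. */
    #include <GL/glut.h>
    #include <stdlib.h>

    int main(int argc, char **argv)
    {
        GLuint tex;
        unsigned char *voxels;

        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE);
        glutCreateWindow("3D texture repro");  /* GL context is current from here on */

        voxels = calloc(16 * 16 * 16 * 4, 1);  /* tiny all-black RGBA volume */

        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_3D, tex);
        glTexImage3D(GL_TEXTURE_3D, 0, GL_RGBA, 16, 16, 16, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, voxels);
        glEnable(GL_TEXTURE_3D);
        glFinish();  /* force the driver to process the upload now */

        free(voxels);
        return 0;
    }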
It is one thing to develop good hardware. That is 1/3 of the task. Companies have been doing that for years. Remember the G400? Great hardware, awful drivers. The next 1/3 is making everything run correctly. ATI, which is pretty good these days, is still fixing bugs like flipped textures in Flight Sim 2004 (which would obviously affect other games; FS2004 is just the first to expose it) in their latest drivers. How mature can we expect the drivers to be from a new company? Not very, given that no new company has gotten it right on the first try, and these two do have a track record of being awful.
The final 1/3 of the driver is to extract maximum performance. I'm not too worried about this unless it comes before the other, more important parts, since generally that is some form of cheating, or as Trident called it, "application specific acceleration".
I truly pity somebody who buys one of these cards because it gets the same 3DMark score as a GeForce or Radeon but costs $10 less. There is more to a good card than the scores it gets in the most popular benchmarked games; it is more important that it runs all the games you want. The Xabre can't run FS2004, Splinter Cell, Homeworld 2, etc. At least Intel's integrated graphics, which may not be that quick, do run everything reasonably well.
Please remember this when you recommend a system for a friend. Please insist they either spend the extra money for an nVidia or ATI product or simply get a card in their price range with less raw 3dmark performance from nVidia or ATI.
You get what you pay for when it comes to software support!
Re:I smell the rotting corpse of Aureal Vortex (Score:2, Insightful)
Re:XGI = SIS + Trident (Score:3, Informative)
It is good to see the graphics chip market is not too much of a closed shop for any new companies to enter. I would like to know how they plan on getting around the many recent patented methods nVidia and ATI share with each other; will the SiS side of the company get around this? I am not an expert, so please enlighten me!
Re:XGI = SIS + Trident (Score:3, Insightful)
Re:XGI = SIS + Trident (Score:2)
I personally agree with you here; no matter what slight advantage ATI cards currently get in the latest benchmarks, NVidia's drivers are simply better, more stable, etc.
I expect Nvidia and ATI to leapfrog back and forth, their latest cards stealing the benchmark-king title from each other, but I don't expect ATI to ever make drivers I like as much as Nvidia's, so I will continue to buy Nvidia's cards.
Re:XGI = SIS + Trident (Score:2)
While people have reportedly had problems in the past, I've got to tell you, there is *nothing* wrong w/ ATI.
Re:XGI = SIS + Trident (Score:2)
Re:XGI = SIS + Trident (Score:2)
Re:XGI = SIS + Trident (Score:2)
But you CAN cook an egg on an Athlon XP! This person did it!
Re:XGI = SIS + Trident (Score:2)
Re:XGI = SIS + Trident (Score:3, Interesting)
Re:XGI = SIS + Trident (Score:2)
Re:XGI = SIS + Trident (Score:2, Insightful)
Re:XGI = SIS + Trident (Score:2)
You would think, until "Card X" comes along and only implements 95% of DirectX 9.0 but claims they implemented it all, and some feature used by only a couple of games makes those games randomly crash, but only on "Card X".
Matrox and S3 cards always seemed to work like that to me in the past.
Re:XGI = SIS + Trident (Score:5, Insightful)
Then I discovered, upon using a couple of computers that had reference boards rather than Diamond-enhanced boards, that the reference boards with the reference drivers were an order of magnitude better, faster, and more stable than what I had believed possible from the Diamond junk.
Just because XGI is a "new player" (with experienced hands), and the beta card sucks and the beta drivers suck, doesn't mean that they can't make a quality product out of it.
Dont feed the trolls (Score:2)
Re:eXtreme (Score:2)
So it should be SGI... but I think someone might be upset if they did that.