Ask NVIDIA Interview 101
A reader writes: "There's a pretty lengthy interview with NVIDIA, which covers many interesting current topics, including the Xbox, BeOS support, Mac support and the NV20." They also cover quality control - that's been my biggest issue with the cards.
Tweaking Your Cards (Score:1)
Where can I find the latest and greatest tricks and techniques for tweaking your cards? Thanks.
Re:Which geforce 2? (Score:1)
Re:Why Should I? (Score:5)
And for anyone who runs anything other than (or in addition to) Windows and Linux, just about any other card is better, since it will probably work.
Besides, even assuming the drivers wouldn't crash my machine willy-nilly, I have better things to do than fight my package system to manually graft these ridiculous drivers into what is otherwise a well- and tightly integrated system.
As always, it depends on what you have and what you do, but for me, their drivers are not an option.
--
Change is inevitable.
Re:3dfx (Score:2)
Check the updates. They've had D3D and OGL support for Unreal 1 for some time.
Re:Why Should I? (Score:2)
This is not true. T&L support for the radeon is on the way.
This is also not true. The specs are under NDA but the DRI developers have them.
Re:Why Should I? (Score:2)
And this one deserves special comment, because it's borderline insulting. From dri-devel, on the MGA400 driver that Gareth has just done some work on:
The DRI is already pushing the limits of most of the hardware it supports. Mesa has recently gone through significant re-architecture work to prepare for the limits of future hardware.
3dfx (Score:1)
Granted, I haven't checked for updates in a long time (it's been about a year since I finished Unreal), but it would be cool to play some of those "Voodoo or Software Only" games on another card.
--
Re:Buy Matrox or ATI Instead (Score:1)
But which? Re: Buy Matrox or ATI Instead (Score:1)
But should I buy ATI or Matrox? I don't play many 3D games, but I do want decent performance when I do. I also want good 2D because that's what I use most (I want AA text too).
So my question is, which is "better" for me, ATI or Matrox? I appreciate that ATI has faster 3D, but is it as well supported in XFree 4.0.2? How does its 2D compare to Matrox? If the ATI has better 3D and comparable 2D, why did so many people mention Matrox and so few ATI in the recent Ask Slashdot on this topic?
Help?
Thanks,
Stuart.
Re:More serious Mac issues (Score:2)
Tell me something. How many times has Apple promised to revolutionize MacOS, and how many times have they shelved the revolutionary version in favor of hacking up the old 1980s technology again?
When will the source code obfuscation end? (Score:2)
I don't want to get into the details of how this violates the XFree86 license, or why you may or may not want to do such a thing. I just want to ask one specific question.
Now that you've crushed the competition, when might you consider laying off this practice?
The danger: NVidia close relations with Microsoft. (Score:1)
But will it stay the same? Who knows?
Why doesn't NVidia support BeOS? They won't even allow Be to write their own driver!
I find this VERY disturbing.
I'm currently trying to choose a new video card, and I think a Radeon might be a safer bet.
Re:Why Should I? (Score:1)
You play that "I'm happy with what I got, sorry it doesn't work for you" game and seek sympathy because you want the "Best performance out of my gear." Well, that doesn't work with nVidia because their gear is closed.
Hell, the GPL grew out of a conflict between RMS and Xerox over printer drivers... What is so different?
I won't cry if you use Windows to play your games. I'll keep whacking away on Open Source drivers.
Pan
Re:Why Should I? (Score:1)
Pan
sdgkjdsfa (Score:1)
Please tell me I'm not the only one that thought this was a slashdot interview...
--K
Support for *BSD? (Score:2)
It's hard for me to buy a card if my free platform of choice is unsupported.
--K
Serious Clue Issues (Score:2)
Well, apparently Apple thinks so, as you can pre-order it (ships 03/24) in the Apple Store...
However, since you're obviously extremely brilliant and clueful
(as demonstrated by your posting), they're probably wrong, and you are likely right.
--K
Linux Tunnel Vision (Score:4)
I see it as follows: I can't use NVIDIA cards for 3D. Period.
I really wish the 'L33n00x !z k3w1' crowd would realize that Linux is not the only free OS out there.
Carmack has said himself that when the next Doom game comes out in a test release, it will be nVidia only for Linux.
That doesn't really sound like Carmack. From his postings to slashdot,
he sounds like he supports interoperability through OpenGL.
'Course it may be that only NVIDIA cards support the necessary OpenGL extensions, or that it'll be NVIDIA-only in just the test release.
Regardless, my next card will prolly be a Matrox.
Yeah, the 3D is pokey compared to NVIDIA's, but Matrox 2D quality supposedly can't be beat,
and the 3D drivers are open.
If I bought a GEForce, I'd essentially be buying an overpriced, inferior 2D card.
--K
tile based rendering no support under linux (Score:1)
I believe Nvidia took a lot of IP from 3dfx.
What about the GigaPixel IP?
Tile-based rendering is often the better solution; one such solution is PowerVR.
Everything is going towards LOW power, LOW bandwidth
(these are both subjective: low compared to now, not before).
Everything is going on-chip (SOC).
An explanation of PowerVR:
PowerVR's unique tile-based rendering approach and on-chip Z and Frame buffer processing drastically reduces memory bandwidth leading to a scalable and significantly lower-cost graphics solution than traditional 3D approaches and enabling new applications for mobile digital devices, set-top-boxes and Internet appliances. PowerVR is uniquely capable of empowering high-performance graphics on consumer devices. PowerVR's patented low-bandwidth architecture is essential to provide high quality digital graphics in affordable consumer electronics solutions. Traditional 3D architectures simply cannot provide comparable graphics processing power at an affordable cost.
Yes, it's marketing speak, but it's true.
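The bandwidth claim can be illustrated with a toy sketch (this is not PowerVR's actual pipeline; the tile size and triangle format here are invented for illustration): triangles are first binned into small screen tiles, so each tile's Z and frame buffer can live on-chip while the tile is shaded.

```python
# Toy illustration of tile-based rendering's first stage: binning.
# Not PowerVR's real pipeline; TILE and the triangle format are invented.
TILE = 32  # hypothetical tile size in pixels

def bin_triangles(triangles, width, height):
    """Map each triangle (a list of (x, y) vertices) to the screen tiles
    its bounding box may cover. Each tile's Z/frame buffer then fits
    on-chip, so external memory is touched once per tile, not per pixel."""
    bins = {}
    for tri in triangles:
        xs = [v[0] for v in tri]
        ys = [v[1] for v in tri]
        x0, x1 = max(min(xs), 0), min(max(xs), width - 1)
        y0, y1 = max(min(ys), 0), min(max(ys), height - 1)
        for ty in range(y0 // TILE, y1 // TILE + 1):
            for tx in range(x0 // TILE, x1 // TILE + 1):
                bins.setdefault((tx, ty), []).append(tri)
    return bins
```

A traditional immediate-mode renderer instead reads and writes the full-screen Z buffer in external memory for every triangle, which is where the bandwidth difference comes from.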
regards
john jones
Re:DVI Support (Score:1)
(--) PCI:*(1:0:0) NVidia GeForce2 MX rev 161, Mem @ 0xe0000000/24, 0xd8000000/27
And it works just dandy!
DVI support for SGI's 1600SW (Score:2)
Does anyone know of a dual-DVI Nvidia card that works with the 1600SW? Quad?
This screen requires a digital transmitter with lots of bandwidth, and some cards with outboard transmitters won't work with it (e.g. IBM's Riva TNT2 M64 DVI-I, which has a Silicon Image 154).
Cards that will support this screen are: Matrox G400/DVI, Hercules/ Guillemot Geforce1 DDR-DVI (PCI !!!), Geforce2 MXs with an outboard Transmitter, Geforce2 Pros with an outboard transmitter and Geforce2 Ultra AND Nvidia Quadro cards like the SGI V3/VR3. Do not bother with GeForce2 GTS/DVI cards. They will not work. They have an onboard transmitter that only supports 10x7 screen bandwidths.
Currently I am using the Asus AGP-V7100/2V1D with the 1600sw and MultiLink adapter on Mandrake Linux + XFree86 4.0.1 + NVidia's
binary drivers. It works well, except that the console looks ugly (in most modes Grub lets me pick). Is there any hope for console support without using the framebuffer console? And it was a bit of a hassle getting the current binaries working in X... but it looks great.
I'm tied to ATI & matrox for exactly this reason. (Score:1)
But the truth is, ATI and Matrox want the Linux market because it isn't as 3d demanding, and they aren't so competitive in that area. NVidia can focus on the 3d performance markets.
--
Why Should I? (Score:5)
Why should I? As a user of Linux who does not play the "Open Source Or Die" game with my hardware and drivers, please give me a good reason as to why I should do this! From my vantage point, I see it as follows:
Bryan R.
Re:Linux Tunnel Vision (Score:2)
It's true the 3D performance really isn't there. The framerates I was getting with Q3 under Win2k were about 20% less than my brother running Win98 with a GF256 (SDR). I think the 20% could be attributed to the poor Win2k drivers (at that time - can't speak to the current Windows drivers). For the money, you could've gotten about 50% greater FPS, but if your priorities are like mine (and it sounds like they are), dualhead, insane resolutions and refresh rates, and pure 2D gorgeousness are considerably more important.
I'm unfamiliar with the state of ATI's linux drivers, but if they're of good quality, and gaming is somewhat important and dualhead is not, I would recommend a radeon of some sort. Their 2D quality is in the same class as the G400's, and similarly feature-rich (it doesn't hurt that the design is rather newer, either).
Re:Linux Tunnel Vision (Score:1)
Lies. My MX board kicks ass.
Kagenin
Re:Buy Matrox or ATI Instead (Score:1)
No BeOS? No explanation? These guys suck (Score:1)
Re:Attitude Towards BeOS (Score:1)
Personally I don't see it, but I'm probably just used to her by now. Either way, their answer was pretty blunt. Overly blunt. Hmmm... nVidia's partnership with Microsoft must be rubbing off on them.
Re:In other news... (Score:4)
By the wording of NVidia's answers I have been left with an overwhelming feeling that any answers from developers have been significantly mangled by their marketing and/or PR department.
Reading the article, I was disappointed at how curt they were in answering potentially "meaty" developer questions.
What does NVidia wish to achieve with the interview?
Generate interest in their products for future purchase.
Who reads Sharky Extreme?
Hardcore computer users.
Do the responses from the interview generate more (buying) interest in Sharky Extreme readers?
No. I can't speak for all, but I feel Nvidia side-stepped many of the questions, and I was unimpressed with the quality of the answers.
I love their products, but find their PR representatives doing them a disservice.
Re:Buy Matrox or ATI Instead (Score:1)
I specifically have the cursed configuration: A7V, 1 GHz Athlon, Creative SB Live! It seems that the RAID version of the Asus Socket A board works fine. I'm going to try updating the BIOS to the latest version listed; otherwise I'll return it. :(
Thanks to tomshardware.com
DVI Support (Score:1)
ATI and Matrox seem to have it working, why not nVidia? I want a dual output DVI GeForce board! Why is that so hard?
Re:Buy Matrox or ATI Instead (Score:2)
ck
Re:Linux Tunnel Vision (Score:2)
For some real life perspective on that, my (former) roommate and I have the exact same monitor (Sony G400, 19" flat Trinitron). He has a Matrox G400, and I have a Hercules GeForce2 GTS 64 MB.
In 2d, neither of us can tell a difference in image quality, either in windows or linux (no bsd on here yet).
However, in 3d applications, he finds that software rendering is faster for games that aren't supported by Matrox's TurboGL. Obviously, that's not cool.
The rumoured 2d difference is negligible in my experience (if it even exists), and nvidia's 3d power just kicks ass over everything else. GeForce2 MX's are now going for ~$99, so I'd think it silly to get something else.
this is dead (Score:1)
Ouch. (Score:1)
BeOS gets the finger.
Dirk
Re:Open/Free Hardware vs. Open/Free Software. (Score:1)
Re:Why Should I? (Score:1)
This is not true. T&L support for the radeon is on the way.
I said likely remain the case for some time, not "it will never happen." If work has just begun on the T&L driver, chances are it will take a while. In all likelihood, there will be a new generation of chips out by the time it's stable. If there is a target date, I would be interested in hearing it - "on the way" can mean anything.
The DRI Radeon drivers do not support T&L, and ATI is not releasing the necessary info for the developers to integrate it.
This is also not true. The specs are under NDA but the DRI developers have them.
Then this is a very recent development. Less than a month ago, VA was "in negotiations" to do a T&L driver, and the "details" could not be discussed. It couldn't even be mentioned whether the driver would support 3D textures.
-Mark
Re:Why Should I? (Score:1)
The only halfway interesting feature of the V5 is FSAA w/ SLI, but that still isn't supported.
The only truly interesting piece of modern hardware with DRI drivers in progress is the Radeon, but support for this chip is still severely lacking.
Don't be so sensitive. People are extremely vocal picking out flaws with the NVIDIA drivers. But criticize the DRI and people get all huffy, even if there are valid complaints about the level of support, timeliness of release, difficulty of installation, etc. "Yeah, but they're open source" doesn't cut it when you need to get real work done. Yes, the NVIDIA drivers have some problems, but so do the DRI drivers.
-Mark
Re:Why Should I? (Score:2)
This is totally false. I have two SMP machines at home, and we've got 3 SMP machines at work that are all using the NVIDIA drivers. They are very fast, and very stable. The NVIDIA drivers are the only drivers which support T&L under Linux, and this will likely remain the case for some time. The DRI Radeon drivers do not support T&L, and ATI is not releasing the necessary info for the developers to integrate it. So, buy a Radeon and you're only going to get the features of a Rage128. The NVIDIA drivers fully exploit their hardware - the same cannot be said about any of the open source drivers at the present time.
People complain about how 'difficult' it is to install the NVIDIA drivers. If people actually read the install instructions, installing them is trivial. Before you complain about the difficulty of installing the NVIDIA drivers, why don't you try installing the DRI drivers from scratch?
-Mark
In other news... (Score:2)
It's a good thing, too, because I'd hate to see those sludge-talk skills go to waste. In response to a few dozen direct, eloquent questions, they let slip the following valuable insights:
In my experience, there are two things you can always count on with this company: (1) that their products will be great, and (2) that anything they say is so full of crap that it's not worth the paper it's not printed on, much less the time needed to read it.
cheers,
mike
Re:Linux Tunnel Vision (Score:1)
http://daily.daemonnews.org/view_story.php3?story
My advice is to sign the petition [m87-blackhole.org] and hope they listen and finish up the work.
Re:Linux Tunnel Vision (Score:2)
Re:Why Should I? (Score:2)
1.) By relying on a binary-only driver that must run with root privileges, your system can no longer be trusted. You don't know what that driver contains. You don't know if it contains something that could compromise your entire system's security. You don't know if it contains an obscure bug that could bring down your whole system and might never be fixed because there aren't enough eyes probing the code.
>>>>>>>>
Good god, I'm not running an NSA server here, just my desktop machine! Second, I'm sure that all the "eyes probing the code" have made GNOME the paragon of stability that it is (tongue in cheek).
2.) Any company that refuses to open source their hardware drivers clearly does not understand and support the Open Source movement. Such companies, after this much time, are unlikely to change. To use their products is to be forever stuck with a proprietary solution. And what happens when the company phases out driver development for older products? You are now stuck with a binary driver that ONLY works with a specific, outdated Open Source version. Let's say, hypothetically, that tomorrow NVidia stopped developing the GeForce drivers for XF86. Would you be satisfied running XF 4.0.2 for the rest of your video card's useful life?
>>>>>>>>>>
That's funny, especially when said in reference to NVIDIA. NVIDIA is still providing driver updates for the Riva128. The card is three years old. If you're still using hardware that old, you have no right to complain about lack of software support. Also, Win98 runs perfectly well with the 5- to 6-year-old Rage II drivers, so if XFree86 5.0 isn't compatible, blame XFree, not NVIDIA. (Not to mention Linus and his driver-API-of-the-day games.)
3.) To use an old saying, "Slow and steady wins the race.." Sure a closed source driver may offer an adequate solution *right now*, but an open source driver will inevitably surpass the closed one in quality in the near future.
>>>>>>>>>>
Which is exactly why GNOME totally whips NT4's ass in GUI speed. Not. Face it, OSS isn't nearly the panacea of software development that it's cracked up to be. Properly done, OSS can be a big boost for a software project. It just doesn't do miracles, that's all.
That is an overview for all hardware drivers. Now what about NVidia vs. ATI/Matrox? Consider that ATI and Matrox cards are generally accepted as having higher quality RAMDACs, which lead to better 2D image quality (a cleaner analog signal). Furthermore, I believe the Radeon DDR bests the GeForce2 GTS in 32-bit at high resolutions by a significant margin.
>>>>>>>>>>>
Uh, no. Where do you get your info? While it's true that both the Radeon and Matrox cards do *look* better, the GF2 GTS is more than 10-15% faster than a Radeon 64 DDR in most games (even at high resolutions). For a while there, the Radeon beat the GTS in Quake III at 1600x1200, but after the Detonator 3 drivers, NVIDIA came out ahead big time.
Re:But which? Re: Buy Matrox or ATI Instead (Score:2)
Re:More serious Mac issues (Score:2)
B) AGP 8X? Please! There was a 3 year gap between the release of AGP 2X (the LX chipset) and AGP 4X (the 820 chipset). AGP 8X is still a few years away!
Re:Which geforce 2? (Score:2)
Re:Why Should I? (Score:2)
Last feature update for XFree was in 3.3.something, when NVidia switched to a new architecture for their open drivers.
>>>>>>>>
Whoops, my mistake. Sorry, the BeOS Unified NVIDIA Drivers support the Riva128 through GeForce2 GTS. I just assumed Detonator 3 did as well. Still, the TNT-1 is an awfully old chip to still be seeing driver updates...
Riva128 under X still can't calculate timings correctly - my GTF-calculated Modeline for 1280x1024@85Hz doesn't work - it gives 80Hz. With some tweaking I was able to get 81Hz - still far from 85Hz. The Riva128 driver does not support the XVideo extension - although the hardware is capable of it.
The glx module for 3.3.3/3.3.5 is a joke - I was not even able to run a GL screensaver with it - it crashed the whole X server (that's my whole need for 3D - screensavers and occasional games like Chromium).
>>>>>>>>>
Aren't any problems in 3.3.x technically the problem of the XFree guys and their drivers?
BTW, when I bought the card, nowhere on the box was it written "You can use this card only until Feb 1999; after that you must buy a new one". The card still works, the chip has the functions I need, but I just can't use it.
>>>>>>>>
Yeah, but you also have to remember that nowhere on the box did it say "Linux supported." You bought a card that dates back to before Linux was on CNN, and you shouldn't be surprised if it is not supported on Linux.
And finally, if the specs were available, you could still use your favourite OS in years to come (you are a Be fan, right?).
>>>>>>>>>>
For BeOS, I'm switching to the Radeon II. I'm pissed at NVIDIA for not giving Be the specs under NDA, but I can understand their reasons for doing so. I can get as mad as I want at Linus for not tuning Linux for media performance, but he has his own reasons; I have to respect that and use something else.
So I can choose a card that looks better, supports the XV extension and performs decently in 3D (Matrox, ATI), or a card that offers only 2D with open drivers and no extensions (GF2xx). Guess what will NOT be my next card.
>>>>>>>>>>
Huh? The Radeon looks richer in 2D. That's fine. However, the Radeon is barely supported in XFree 4.0, and its 3D performance is limp compared to NVIDIA's. I see the options this way: I could be an OSS bigot (and for some things, it makes sense to be one) and use a slower, less fully featured card, or I can use a card with the best 3D acceleration (if performance counts at all, how is Matrox even an option?), double the 2D speed, and nice, stable (for the most part) drivers.
Re:NVIDIA sucks BADLY - We NEED BeOS support! (Score:1)
A polite petition would probably work better.
*shrugs*
The question they missed: (Score:2)
Followup: If so, do they want the death of Nvidia or just your firstborn sons?
Open/Free Hardware vs. Open/Free Software. (Score:1)
1.) In the vast majority of the consumer world (meaning Windows, Macintosh, and UNIX), not only are the drivers closed-source, but this is an accepted practice. Drivers are provided free of charge (gratis); drivers aren't a source of income.
2.) To have a truly 'Open' 3D Driver of high performance, it is necessary to include the 3D code, or at least the hardware interface to the Graphics API (in the case of Linux, Mesa, or OpenGL). nVIDIA chose to use certified OpenGL, not Mesa. I'm not entirely current on Mesa's status; nevertheless, OpenGL is seen as a better option to many buyers, as it is a guaranteed "OpenGL ARB"-compliant implementation, rather than a non-"OpenGL ARB" certified implementation of OpenGL. A certified OpenGL implementation cannot, I believe, be released as 'free' software.
3.) Having the hardware interface for their card published (as Open-Source) makes it easier to reverse-engineer the hardware. ATI wouldn't care about doing so, but about every other manufacturer would, as even Matrox's 3D is lacking. This is essentially a 'security-through obscurity' scheme, but in the hardware world, it's all you have.
4.) nVIDIA is a *HARDWARE* company. R&D on hardware isn't anywhere near the curve that it is on software. It's much more difficult to design a chip than write code. Chip fab is so obscenely expensive that it isn't really possible to have the Free-Software equivalent of hardware. Unlike software, the machines & tooling to fab hardware can't be duplicated on a whim, and they are extremely expensive. The source materials also cost money (although trivial after machinery & tooling costs). The tooling to make the chip is at least $250k... forget energy, materials, labor, environmental regulations, etc. costs accumulated in physical production. So if there are any bugs, it costs millions.
Too many people don't realize the difference between hardware and software when it comes to Intellectual Property. The software business gets many rights that hardware makers don't; and they have far fewer of the penalties.
Free software takes only a compiler, time, and a coder to generate a useful, reliable, high-quality product.
You *CAN'T* do that with hardware. But, we'll suppose you have a Free Hardware design.
Now 'compile' it from the design code to a useful, reliable, high-quality chunk of silicon.
... For under $250,000.00 US.
Free software exists because it's inexpensive enough to be a hobby... a passion. All it takes is time and effort.
Free hardware can't say that. Only Billionaires have the resources to create free hardware.
So give nVIDIA some credit and a chance to get a return on their investment in the hardware and tooling to create the chip. The only ones really able to take advantage of an open-source driver are other hardware manufacturers, who would use it to reverse-engineer the hardware, or users of an OS without a driver. Since Windows, Mac, and Linux cover all but the smallest part of the OS world that uses 3D, and the *BSD clones are used (and advocated) primarily as servers that don't even need X (not as graphics workstations or gaming stations), it shouldn't even be an issue to anyone... except for QNX and BeOS users.
Why did you break power management... (Score:1)
Re:Why Should I? (Score:3)
1998 computer bought, only drivers available for Windows NT
1999 Drivers released for Windows 95/98, shortly followed by _very_ buggy Linux drivers.
Right now, nVidia has stopped their development for the Riva128 chipset on Linux (which means it "probably doesn't work"). No support for any video card under FreeBSD or any other OS besides Linux/NT/98/95. No specs opened, yet many developers are *willing* to support this chipset if only they had the specs.
Result: when XFree86 developers or Linux kernel developers change their implementation, the nVidia drivers are likely to become incompatible.
And no, I'm not a gamer, I'm just asking for OpenGL hardware acceleration on my system. Tell me why I should stay with nVidia?
Re:Why Should I? (Score:1)
1.) By relying on a binary-only driver that must run with root privileges, your system can no longer be trusted. You don't know what that driver contains. You don't know if it contains something that could compromise your entire system's security. You don't know if it contains an obscure bug that could bring down your whole system and might never be fixed because there aren't enough eyes probing the code.
2.) Any company that refuses to open source their hardware drivers clearly does not understand and support the Open Source movement. Such companies, after this much time, are unlikely to change. To use their products is to be forever stuck with a proprietary solution. And what happens when the company phases out driver development for older products? You are now stuck with a binary driver that ONLY works with a specific, outdated Open Source version. Let's say, hypothetically, that tomorrow NVidia stopped developing the GeForce drivers for XF86. Would you be satisfied running XF 4.0.2 for the rest of your video card's useful life?
3.) To use an old saying, "Slow and steady wins the race.." Sure a closed source driver may offer an adequate solution *right now*, but an open source driver will inevitably surpass the closed one in quality in the near future.
That is an overview for all hardware drivers. Now what about NVidia vs. ATI/Matrox? Consider that ATI and Matrox cards are generally accepted as having higher quality RAMDACs, which lead to better 2D image quality (a cleaner analog signal). Furthermore, I believe the Radeon DDR bests the GeForce2 GTS in 32-bit at high resolutions by a significant margin.
Just my $0.02. Please also read Eric Raymond's "The Magic Cauldron." (Especially the last section about open source drivers)
More serious Mac issues (Score:2)
Which geforce 2? (Score:2)
Re:More serious Mac issues (Score:2)
Re:Tweaking Your Cards (Score:2)
True, but only if the framerate is sustained (say, from a movie projector, or high-end graphics gear - SGI or E&S). Personally I have my gaming rig set up with vsync on; it rarely dips below my refresh rate (72 Hz), locked on solid. Plus with vsync enabled, I don't get the screen "tearing" that comes with ultra-high framerates and lots of action.
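The arithmetic behind that is worth a quick sketch (the numbers are just the 72 Hz case above): with vsync on, a frame has one refresh interval to finish, and a frame that misses its deadline waits for the next retrace.

```python
import math

def frame_budget_ms(refresh_hz):
    """Time available per frame when vsync locks swaps to the refresh."""
    return 1000.0 / refresh_hz

def effective_fps(frame_ms, refresh_hz):
    """Framerate when every swap waits for a vertical retrace: a frame
    that overruns its budget slips to the next refresh interval."""
    intervals = math.ceil(frame_ms * refresh_hz / 1000.0)
    return refresh_hz / intervals

print(round(frame_budget_ms(72), 1))   # ~13.9 ms per frame at 72 Hz
print(effective_fps(13, 72))           # renders in time: 72.0 fps
print(effective_fps(20, 72))           # misses one retrace: 36.0 fps
```

This is why vsync'd framerates tend to snap to integer fractions of the refresh rate rather than degrading smoothly.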
Linux ... (Score:1)
Will Microsoft allow game developers to use Linux?
Re:Interview In a Nutshell... (Score:1)
Why did Nvidia buy 3dFX (Score:2)
Well why the hell did they buy 3DFX then? Was it just to take out a competitor?
Re:Why Should I? (Score:1)
But, at home, where I'm in control of what goes inside my computer, I prefer OSS enabled hardware and use Matrox.
3dfx and open source (Score:2)
What is to become now of the 3dfx opensource effort, given nVidia's anti-opensource leanings? The linux.3dfx.com site is gone, and the placeholder 3dfx.com site explicitly lists nothing for linux drivers.
What has become of the existing code? I know some of it has been merged into the XFree86 trees, but the rest?
Re:Buy Matrox or ATI Instead (Score:1)
Re:Buy Matrox or ATI Instead (Score:1)
Re:Buy Matrox or ATI Instead (Score:3)
Re:Buy Matrox or ATI Instead (Score:1)
NVidia comes out with PRFlack 2.0/drivers unchanged (Score:1)
Hell, I'd at least expect them to say something about when the NV 20 is due.
Re:Buy Matrox or ATI Instead (Score:1)
Re:Buy Matrox or ATI Instead (Score:1)
From a site called Sharky's... (Score:1)
47.5% Slashdot Pure(52.5% Corrupt)
Re:Buy Matrox or ATI Instead (Score:1)
I have a GeForce2 MX with 32 megs of RAM. I also have an ATI Radeon 64 MB DDR. Even under Linux, the Radeon beats the GeForce, both in terms of performance (with Q3A) and quality, IMHO.
Now all we need is for the DRI drivers for the Radeon to use the T&L unit on the card.
Ranessin
Re:Buy Matrox or ATI Instead (Score:1)
Not all that more expensive. I think it was maybe $50 more.
Ranessin
Re:Why Should I? (Score:1)
nVidia's drivers are fine as long as you don't have a problem. If you do, you're screwed.
With the DRI drivers, if you have a problem or uncover a bug, just ask the DRI developers and it's usually fixed in a timely fashion.
Ranessin
Re:Buy Matrox or ATI Instead (Score:1)
Where did I say anything about the cause of the performance difference? Hell, I didn't even mention the phrase "open source" anywhere in my post. Yep folks, we've got a real genius here.
Ranessin
Re:When will the source code obfuscation end? (Score:1)
The code for nVidia drivers in XFree86 is *not* obfuscated. At one point it was, but now it isn't. As I recall, the obfuscated code was replaced in 3.3.5.
Now, if you're talking about the Linux drivers they developed in-house: they're not obfuscated, they're closed source.
Ranessin
Re:Buy Matrox or ATI Instead (Score:1)
Frankly, cards should be compared, performance-wise, if their price is in the same range, shouldn't they? What do I care if the MX and Radeon are two different classes if they cost me approximately the same?
I got ripped off for getting a great price on two cards? Can I have some of what you're smoking?
Ranessin
Re:Buy Matrox or ATI Instead (Score:1)
As a consumer, I could give a shit if the cards I compared are in the same league in terms of performance. What I care about is if they're in the same league in terms of price. They were (and yes, I got the Radeon for a good price).
Ranessin
Re:Buy Matrox or ATI Instead (Score:1)
All one has to do is a little research and they can find a *very* good price for a Radeon 64 MB DDR.
Ranessin
Re:Buy Matrox or ATI Instead (Score:1)
It's the normal state of things if you know how to shop, moron.
Ranessin
Interview In a Nutshell... (Score:2)
NVIDIA: NVIDIA is the world leader in graphics solutions.
--
Why open source drivers don't exist. (Score:2)
--
Re:Why Should I? (Score:1)
Could you give the URL? As an owner of the original Riva128, I'm really interested in driver update.
Last W95 driver is from February 1999 (It is W95 driver, not WDM driver!).
Last feature update for XFree was in 3.3.something, when NVidia switched to a new architecture for their open drivers. Riva128 under X still can't calculate timings correctly - my GTF-calculated Modeline for 1280x1024@85Hz doesn't work - it gives 80Hz. With some tweaking I was able to get 81Hz - still far from 85Hz. The Riva128 driver does not support the XVideo extension - although the hardware is capable of it.
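The refresh a Modeline actually delivers is easy to check by hand: it's the dot clock divided by the total (blanking included) horizontal and vertical counts. A small sketch; the timing numbers below are illustrative GTF-style values for 1280x1024@85, not taken from the poster's config:

```python
def modeline_refresh(dot_clock_mhz, htotal, vtotal):
    """Vertical refresh (Hz) implied by an XFree86 Modeline:
    dot clock / (total pixels per line * total lines per frame)."""
    return dot_clock_mhz * 1e6 / (htotal * vtotal)

# Illustrative GTF-style timings for 1280x1024@85: htotal and vtotal are
# the last numbers of each Modeline triplet, with blanking included.
print(round(modeline_refresh(159.36, 1744, 1075), 1))  # ~85.0 Hz
```

If the server reports 80 Hz from an 85 Hz Modeline, either the driver is programming a different dot clock than requested or it is silently substituting another mode; this check tells you which numbers to compare.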
The glx module for 3.3.3/3.3.5 is a joke - I was not even able to run a GL screensaver with it - it crashed the whole X server (that's my whole need for 3D - screensavers and occasional games like Chromium).
BTW, when I bought the card, nowhere on the box was it written "You can use this card only until Feb 1999; after that you must buy a new one". The card still works, the chip has the functions I need, but I just can't use it.
And finally, if the specs were available, you could still use your favourite OS in years to come (you are a Be fan, right?).
Uh, no. Where do you get your info? While it's true that both the Radeon and Matrox cards do *look* better, the GF2 GTS is more than 10-15% faster than a Radeon 64 DDR in most games (even at high resolutions). For a while there, the Radeon beat the GTS in Quake III at 1600x1200, but after the Detonator 3 drivers, NVIDIA came out ahead big time.
So I can choose a card that looks better, supports the XV extension and performs decently in 3D (Matrox, ATI), or a card that offers only 2D with open drivers and no extensions (GF2xx). Guess what will NOT be my next card.
Re:Why Should I? (Score:1)
Do you realize that the BeOS Unified NV Driver is a port of the XFree driver?
Still, the TNT-1 is an awfully old chip to still be seeing driver updates...
When introducing the Riva series, they (NVidia) claimed to have designed a hardware architecture that allowed them to be forward compatible, so you could use new drivers for an old card and vice versa without any problems.
Also, NVidia does not need to make updates for their older cards - they will certainly not do so forever. However, they should publish the specs for people who are interested in and able to self-support their own hardware. See for example GUS support in Linux (GUS = Gravis Ultrasound, a sound card produced until 1994; it is a good sound card even by today's requirements and is still supported in Linux. This is possible because Gravis published the specifications). And it is certainly older than the TNT :-). On the other hand, see HP and some of their scanners under W2K - totally unsupported within one year of release. Do you see why I avoid binary-only drivers?
Aren't any problems in 3.3.x technically the problem of the XFree guys and their drivers?
The nv driver and glx module were written by NVidia staff. Don't forget, XFree guys don't know anything about Rivas. They don't have specs.
Yea, but you also have to remember that nowhere on the box did it say "Linux supported." You bought a card that dates back to before Linux was on CNN, and you shouldn't be surprised if it is not supported on Linux.
Huh? The Radeon looks richer in 2D. That's fine. However, the Radeon is barely supported in XFree 4.0, and its 3D performance is limp compared to NVIDIA's. I see the options this way: I could be an OSS bigot (and for some things, it makes sense to be one) and use a slower, less fully featured card, or I can use a card with the best 3D acceleration (if performance counts at all, how is Matrox even an option?), double the 2D speed, and nice, stable (for the most part) drivers.
This is not about OSS bigotry. It is about future support. No company can guarantee future support of their current products ("It is an awfully old whatever. You'd better go buy a new supported toy."). Open drivers can guarantee that. And as long as the product does what it is supposed to do, why buy new?
As a sidenote: the abovementioned Riva128 is not in my primary machine. My primary machine uses an ATI Mobility M3 and I'm satisfied with its performance. So when the time comes to upgrade that desktop machine (probably this autumn), it will be ATI.
Re:Why Nvidia bought 3dFX (Score:1)
I wouldn't hold my breath waiting for a nvidia Voodoo card either, BTW.
Funk_dat
Buy Matrox or ATI Instead (Score:3)
Re:In other news... (Score:1)
Re:Which geforce 2? (Score:1)
Re:Buy Matrox or ATI Instead (Score:1)
Re:Tweaking Your Cards (Score:1)
Why? (Score:1)
___________
I don't care what it looks like, it WORKS doesn't it!?!
Dear Nvidea... (Score:2)
Re:Buy Matrox or ATI Instead (Score:1)
3dfx (Score:1)
Re:No BeOS driver support from nVidia... (Score:1)
The real interesting part... (Score:2)
The rest was boring "nVidia is getting better" stuff with discussion about memory management (which is no longer news).
In reality they are evolving, not making any breakthroughs.
Attitude Towards BeOS (Score:1)
Re:Attitude Towards BeOS (Score:1)
Sounds Like (Score:1)
Re:Support for *BSD? (Score:1)
Re:Buy Matrox or ATI Instead (Score:1)