Graphics Software

Ask NVIDIA Interview

A reader writes: "There's a pretty lengthy interview with NVIDIA, which covers many interesting current topics, including the Xbox, BeOS support, Mac support and the NV20." They also covered quality control - which has been my major problem with the cards.
This discussion has been archived. No new comments can be posted.

  • I'm the proud owner of a brand new system that sports a 1GHz processor and a GeForce2 vid card. However, I'm getting a disappointing 60fps in my favorite time-killer, Counter-Strike. What's up with that? :-)

    Where can I find the latest and greatest tricks and techniques for tweaking your cards? Thanks.
  • it's an ultra
  • by Andrej Marjan ( 1012 ) on Monday February 12, 2001 @05:00AM (#438739)
    Obviously you don't have an SMP machine. Nvidia's drivers have serious stability problems in that case, but they'll fix them in the next release, for some value of "next".

    And for anyone who runs anything other than (or in addition to) Windows and Linux, just about any other card is better, since it will probably work.

    Besides, even assuming the drivers wouldn't crash my machine willy-nilly, I have better things to do than fight my package system to graft these ridiculous drivers into what is otherwise a well- and tightly integrated system.

    As always, it depends on what you have and what you do, but for me, their drivers are not an option.
    --
    Change is inevitable.

  • I haven't checked for updates in a long time

    Check for updates. They've had D3D and OGL support for Unreal 1 for some time.

  • The NVIDIA drivers are the only drivers which support T&L under Linux, and this will likely remain the case for some time.

    This is not true. T&L support for the radeon is on the way.

    The DRI Radeon drivers do not support T&L, and ATI is not releasing the necessary info for the developers to integrate it.

    This is also not true. The specs are under NDA but the DRI developers have them.

  • The NVIDIA drivers fully exploit their hardware - the same cannot be said about any of the open source drivers at the present time.

    And this one deserves special comment, because it's borderline insulting. From dri-devel, on the Matrox G400 driver that Gareth has just done some work on:

    Windows 98 - 58fps in Q3A demo001

    Linux/DRI - 72fps in Q3A demo001

    The DRI is already pushing the limits of most of the hardware it supports. Mesa has recently gone through significant re-architecture work to prepare for the limits of future hardware.

  • by bjb ( 3050 )
    I'm happy they had a little blurb about 3dfx, but what I would really like them to do with the 3dfx technology is create an nVidia port of the 3dfx API; more to the point, I can't run Unreal on an nVidia card, but I can through 3dfx's proprietary API.

    Granted, I haven't checked for updates in a long time (it's been about a year since I finished Unreal), but it would be cool to play some of those "Voodoo or Software Only" games on another card.

    --

  • If your 12-year old neighbour can afford a $300+ graphics card, then they are likely too busy defending their turf from the other playground crack dealers to be whopping your ass at Q3A.
  • I'm going to be buying a new computer in the very near future and I want to get a good graphics card. I also want to support companies that have been "good" towards open source, so your advice is exactly what I had planned to do.

    But should I buy ATI or Matrox? I don't play many 3D games, but I do want to get decent performance when I do. I also want good 2D because that's what I use most (I want AA text too :) )

    So my question is, which is "better" for me, ATI or Matrox? I appreciate that ATI has faster 3D, but is it as well supported in XFree 4.0.2? How does its 2D compare to Matrox? If the ATI has better 3D and comparable 2D, why did so many people mention Matrox and so few ATI in the recent Ask Slashdot on this topic?

    Help?

    Thanks,
    Stuart.
  • You ACTUALLY believe OSX is right around the corner? What a sad, sad existence you Mac users lead.

    Tell me something. How many times has Apple promised to revolutionize MacOS, and how many times have they shelved the revolutionary version in favor of hacking up the old 1980's technology again?

  • As I understand it, the nVidia code that goes into XFree86 is rendered unreadable before it is submitted.

    I don't want to get into the details of how this violates the xfree86 license, or why you may or may not want to do such a thing. I just want to ask one specific question.

    Now that you've crushed the competition, when might you consider laying off this practice?

  • Yes, right now NVidia support for Linux is OK.
    But will it stay the same? Who knows?

    Why doesn't NVidia support BeOS? They won't even allow Be to write their own driver!
    I find this VERY disturbing..

    I'm currently trying to choose a new video card and I think that a Radeon might be a safer bet.
  • Are you clueless? If it hadn't been for the openness of the PC architecture, you wouldn't be playing on Linux. Or BSD, or maybe even BeOS.

    You play that "I'm happy with what I got, sorry it doesn't work for you" game and seek sympathy because you want the "Best performance out of my gear." Well, that doesn't work with nVidia because their gear is closed.

    Hell, the GPL grew out of a conflict between RMS and Xerox over printer drivers... What is so different?

    I won't cry if you use Windows to play your games. I'll keep whacking away on Open Source drivers.

    Pan
  • It's more like buying a Hummer, but not being allowed to drive it on Sundays?! So yeah, sell it and get a Matrox G400 (my current favorite)

    Pan
  • Argh. I'm a crack baby. Shoot me.

    Please tell me I'm not the only one that thought this was a slashdot interview...

    --K
  • Since it's been made rather clear the drivers will not go open source, is it possible that *BSD users will see a port of the Detonator drivers?

    It's hard for me to buy a card if my free platform of choice is unsupported.

    --K
  • You ACTUALLY believe OSX is right around the corner?

    Well, apparently Apple thinks so, as you can pre-order it (ships 03/24) in the Apple Store...

    However, since you're obviously extremely brilliant and clueful
    (as demonstrated by your posting), they're probably wrong, and you are likely right.

    --K
  • by Pope Slackman ( 13727 ) on Monday February 12, 2001 @06:50AM (#438754) Homepage Journal
    And from my vantage point, as a BSD user who doesn't play the 'Open Source or Die' game,
    I see it as follows: I can't use NVIDIA cards for 3D. Period.
    I really wish the 'L33n00x !z k3w1' crowd would realize that Linux is not the only free OS out there.

    Carmack has said himself that when the next Doom game comes out in a test release, it will be nVidia only for Linux.

    That doesn't really sound like Carmack. From his postings to slashdot,
    he sounds like he supports interoperability through OpenGL,
    'course, it may be that only NVIDIA cards support the necessary OpenGL extensions, or that it'll be NVIDIA-only in just the test release.

    Regardless, my next card will prolly be a Matrox.
    Yeah, the 3D is pokey compared to NVIDIA's, but Matrox 2D quality supposedly can't be beat,
    and the 3D drivers are open.
    If I bought a GeForce, I'd essentially be buying an overpriced, inferior 2D card.

    --K
  • hi

    I believe Nvidia took a lot of IP from 3dfx.

    What about the Gigapixel IP?

    Tile-based rendering is often the better solution; one example of such a solution is PowerVR.

    Everything is going towards LOW power, LOW bandwidth (both relative terms: low compared to now, not to before).

    Everything is going on-chip (SOC).

    Explanation of PowerVR:

    PowerVR's unique tile-based rendering approach and on-chip Z and Frame buffer processing drastically reduces memory bandwidth leading to a scalable and significantly lower-cost graphics solution than traditional 3D approaches and enabling new applications for mobile digital devices, set-top-boxes and Internet appliances. PowerVR is uniquely capable of empowering high-performance graphics on consumer devices. PowerVR's patented low-bandwidth architecture is essential to provide high quality digital graphics in affordable consumer electronics solutions. Traditional 3D architectures simply cannot provide comparable graphics processing power at an affordable cost.

    yes, it's marketing speak, but it's true (a concrete sketch of the idea follows below)

    regards

    john jones
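
    Tile-based rendering is easy to make concrete. Below is a minimal C sketch of the two-pass idea: bin triangles into screen tiles, then shade each tile out of a small on-chip buffer so external memory is touched roughly once per tile. The tile size, limits, and names are illustrative assumptions, not PowerVR's actual design, and the per-triangle rasterization step is elided.

        /* Illustrative two-pass tile renderer.  Tile size, limits and names
         * are hypothetical; real hardware rasterizes and Z-tests per pixel. */
        #include <string.h>

        #define TILE_W 32
        #define TILE_H 32
        #define MAX_TRIS_PER_TILE 256

        typedef struct { float x0, y0, x1, y1; } Tri;   /* screen-space bbox */

        typedef struct {
            const Tri *tris[MAX_TRIS_PER_TILE];
            int count;
        } TileBin;

        /* Pass 1: bin each triangle into every tile its bounding box touches. */
        static void bin_triangles(const Tri *tris, int n, TileBin *bins,
                                  int tiles_x, int tiles_y)
        {
            for (int i = 0; i < n; i++) {
                int tx0 = (int)tris[i].x0 / TILE_W, ty0 = (int)tris[i].y0 / TILE_H;
                int tx1 = (int)tris[i].x1 / TILE_W, ty1 = (int)tris[i].y1 / TILE_H;
                if (tx0 < 0) tx0 = 0;
                if (ty0 < 0) ty0 = 0;
                for (int ty = ty0; ty <= ty1 && ty < tiles_y; ty++)
                    for (int tx = tx0; tx <= tx1 && tx < tiles_x; tx++) {
                        TileBin *b = &bins[ty * tiles_x + tx];
                        if (b->count < MAX_TRIS_PER_TILE)
                            b->tris[b->count++] = &tris[i];
                    }
            }
        }

        /* Pass 2: shade one tile entirely on-chip, then write the finished
         * pixels out in a single pass -- this is the bandwidth saving. */
        static void render_tile(const TileBin *b, int tile_x, int tile_y,
                                unsigned *fb, int fb_width)
        {
            unsigned on_chip[TILE_W * TILE_H];  /* tile-local color/Z live here */
            memset(on_chip, 0, sizeof(on_chip));

            for (int i = 0; i < b->count; i++) {
                /* rasterize b->tris[i] into on_chip (elided) */
            }

            for (int y = 0; y < TILE_H; y++)    /* one burst per row to memory */
                memcpy(&fb[(tile_y * TILE_H + y) * fb_width + tile_x * TILE_W],
                       &on_chip[y * TILE_W], TILE_W * sizeof(unsigned));
        }

    An immediate-mode chip does the same shading but scatters color and Z traffic to external memory for every pixel of every triangle it draws, which is exactly the bandwidth the marketing paragraph is talking about.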

  • My Asus AGP-V7100/2V1D is recognized by XFree86's PCI scan as:

    (--) PCI:*(1:0:0) NVidia GeForce2 MX rev 161, Mem @ 0xe0000000/24, 0xd8000000/27

    And it works just dandy!
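
    (For anyone trying to reproduce that setup: with the binary drivers, the relevant XF86Config-4 stanzas come down to roughly the following sketch from the 0.9-x driver era; the shipped README is authoritative, and it also says to drop the "dri" and "GLcore" module loads.)

        Section "Module"
            Load "glx"              # NVIDIA's GLX module
        EndSection

        Section "Device"
            Identifier "GeForce2 MX"
            Driver     "nvidia"     # binary driver; "nv" is the open, 2D-only one
        EndSection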
  • Nvidia needs to publish a list of which Nvidia-based cards and BIOSes support the bandwidth and modes needed to use the DVI interface with the SGI 1600SW.

    Does anyone know if there will be a dual-DVI Nvidia card that works with the 1600SW? Quad?

    This screen requires a digital transmitter with lots of bandwidth and some cards with outboard transmitters won't work with it (eg IBM's Riva TNT2 M64 DVI-I which has a Silicon Image 154).

    Cards that will support this screen are: Matrox G400/DVI, Hercules/Guillemot GeForce1 DDR-DVI (PCI !!!), GeForce2 MXs with an outboard transmitter, GeForce2 Pros with an outboard transmitter, and GeForce2 Ultra and Nvidia Quadro cards like the SGI V3/VR3. Do not bother with GeForce2 GTS/DVI cards. They will not work. They have an onboard transmitter that only supports 10x7 screen bandwidths.

    Currently I am using the Asus AGP-V7100/2V1D with the 1600SW and MultiLink adapter on Mandrake Linux + XFree86 4.0.1 + NVidia's
    binary drivers. It works well, except that the console looks ugly (in most of the modes Grub lets me pick). Without using the framebuffer console, is there any hope for console support? And it was a bit of a hassle getting the current binaries working in X... But it looks great.
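
    The transmitter-bandwidth point above is easy to check with rough arithmetic. In this sketch the 25% blanking overhead is an assumed round number, not a datasheet figure:

        /* Rough pixel-clock arithmetic for the 1600SW's native mode.
         * The blanking overhead is an illustrative assumption. */
        #include <stdio.h>

        int main(void)
        {
            double blanking = 1.25;                            /* assumed overhead */
            double clk_1600sw = 1600 * 1024 * 60 * blanking;   /* ~123 MHz */
            double clk_10x7   = 1024 * 768  * 60 * blanking;   /* ~59 MHz  */

            printf("1600SW needs ~%.0f MHz; a 10x7-class link gives ~%.0f MHz\n",
                   clk_1600sw / 1e6, clk_10x7 / 1e6);
            return 0;
        }

    A transmitter sized for 10x7-class modes falls short of the panel's pixel clock by roughly a factor of two, which is consistent with the GTS/DVI boards failing while cards with beefier outboard transmitters work.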
  • And I'm a linux user who does not play the "Open Source Or Die" game, and who cannot use the NVIDIA drivers - because I don't have an Intel CPU. If you give me *any* gnu-style .tar.gz, I can *always* install it.

    But the truth is, ATI and Matrox want the Linux market because it isn't as 3d demanding, and they aren't so competitive in that area. NVidia can focus on the 3d performance markets.
    --
  • by BRock97 ( 17460 ) on Monday February 12, 2001 @04:40AM (#438759) Homepage
    We should really be more concerned with the developments of ATI and Matrox. Their 3D drivers are open source and are part of XFree4. NVidia has chosen to ignore DRI and stay closed source.

    Why should I? As a user of Linux who does not play the "Open Source Or Die" game with my hardware and drivers, please give me a good reason as to why I should do this! From my vantage point, I see it as follows:
    • Currently, the nVidia driver is one of the fastest around. Could it be faster if it was open? Sure, I believe that fully, but it is pretty damn fast.
    • As an everyday user of Linux who doesn't download the latest Enlightenment or KDE beta or XFree86 release, I can stand to be behind in my releases to keep compatible with my windowing server.
    • Carmack has said himself that when the next Doom game comes out in a test release, it will be nVidia only for Linux.
    • He then goes on to add that he himself will start working on the drivers for the ATI cards to bring them up to speed so it can play his game decently.
    • Sorry, but since this demo is probably a year away, and since JohnC typically knows his s#!t, I'll trust his belief that nVidia is the best solution right now. From his posts here on Slashdot, he seems to know his stuff.
    • I am not saying that he endorses nVidia for their driver practices or anything, this is stuff I have walked away with from things he has said.
    So, as a gamer who would like to see the best performance out of my gear, and basing my current opinion off of things I have read, please convince me otherwise. I believe, though, most users of Linux feel this way and just want their stuff to work.

    Bryan R.
  • As a G400 user, I can say unequivocally that the 2D performance is unparalleled. I have a high-quality Viewsonic monitor (PF795), and can run it at any resolution and refresh rate it can handle at 32bpp, with a crystal-clear image all the way from 640x480 up to 1800x1440 (the best the monitor will handle).

    It's true the 3D performance really isn't there. The framerates I was getting with Q3 under Win2k were about 20% less than my brother got running Win98 with a GF256 (SDR). I think the 20% could be attributed to the poor Win2k drivers (at that time - I can't speak to current drivers for Windows). For the money, you could've gotten about 50% greater FPS, but if your priorities are like mine (and it sounds like they are), dualhead, insane resolutions and refresh rates, and pure 2D gorgeousness are considerably more important.

    I'm unfamiliar with the state of ATI's linux drivers, but if they're of good quality, and gaming is somewhat important and dualhead is not, I would recommend a radeon of some sort. Their 2D quality is in the same class as the G400's, and similarly feature-rich (it doesn't hurt that the design is rather newer, either).
  • Quoth...

    Actually, the MX is a piece of shit card. Save yourself some pain and get a Radeon 32 DDR ($87 on pricewatch!)


    Lies. My MX board kicks ass.

    Kagenin
  • The point is that they could release specifications to the parts that are NOT licensed from others, and allow the community to write the drivers...
  • At least he's settled all the rumors about whether or not BeOS will have nVidia support - and they won't have any. These guys are unbelievable.
  • Eugenia is certainly no newbie - she's an editor at BeNews, co-founded BeUnited, and dates one of the Be engineers. *But* she has been known to be pretty blunt about stuff in the past and I can understand how this would turn people off.

    Personally I don't see it, but I'm probably used to her by now. Either way, their answer was pretty blunt. Overly blunt. Hmmm... nVidia's partnership with Microsoft must be rubbing off on them.
  • by Tronster ( 25566 ) on Monday February 12, 2001 @07:40AM (#438765) Homepage
    Agreed.
    By the wording of NVidia's answers, I have been left with the overwhelming feeling that the developers' answers were significantly mangled by the marketing and/or PR department.

    Reading the article, I was disappointed at how curt they were in answering potentially "meaty" developer questions.

    What does NVidia wish to achieve with the interview?
    Generate interest in their products for future purchase.

    Who reads Sharky Extreme?
    Hardcore computer users.

    Do the responses from the interview generate more (buying) interest in Sharky Extreme readers?
    No. I can't speak for everyone, but I feel Nvidia side-stepped many of the questions, and I was unimpressed with the quality of the answers.

    I love their products, but find their PR representatives doing them a disservice.
  • Plus, there is a very evil problem with the Geforce 2 MX. If you have an Asus A7V mobo, don't buy this card. I can't do anything OpenGL or DirectX without it freezing the system completely.

    I specifically have the cursed configuration: A7V, 1 Gig Athlon, Creative SB Live! It seems that the RAID version of the Asus Socket A board works fine. I'm going to try updating the BIOS to the latest version listed; otherwise I'll return it. :(

    Thanks to tomshardware.com
  • What _I_ want to ask is when we'll start seeing some working DVI support on nVidia-based cards. The GeForce1 works OK with DVI, but at limited resolutions. GeForce2 support is outright broken. They claim to have fixed the problem in the GeForce2 MX chip; but no MX boards today ship with a DVI out, and I don't know if there are plans for any.

    ATI and Matrox seem to have it working, why not nVidia? I want a dual output DVI GeForce board! Why is that so hard?
  • nVidia's driver contains intellectual property owned by other companies -- information that they're legally bound not to release. They've posted this info in the past. They'd like to release an open source driver, but starting from scratch isn't very appealing when they already have a driver that works very well.

    ck
  • Regardless, my next card will prolly be a Matrox. Yeah, the 3D is pokey compared to NVIDIA's, but Matrox 2D quality supposedly can't be beat, and the 3D drivers are open. If I bought a GeForce, I'd essentially be buying an overpriced, inferior 2D card.

    For some real-life perspective on that, my (former) roommate and I have the exact same monitor (Sony G400 (19" flat Trinitron)). He has a Matrox G400, and I have a Hercules GeForce2 GTS 64 Mb.

    In 2d, neither of us can tell a difference in image quality, either in windows or linux (no bsd on here yet).

    However, in 3D applications, he finds that software rendering is faster for games that aren't supported by Matrox's TurboGL. Obviously, that's not cool.

    The rumoured 2D difference is negligible in my experience (if it even exists), and nvidia's 3D power just kicks ass over everything else. GeForce2 MXs are now going for ~$99, so I'd think it silly to get something else.
  • The code is not obfuscated anymore; get over it. Nvidia fucked up when they changed the code right before 3.3.4 was released. They got roasted for it, so they fixed the problem in 3.3.5. The fact is that it's just not that fast compared to their in-house developed drivers.
  • The question was asked, in hope; the question was answered, in PR-speak.

    BeOS gets the finger.

    Dirk

  • oh puh-lease. of course it costs money to build hardware. it costs Ford more than that to build a car. does the average person expect Ford to weld the car hood shut? no? why not? cos it's ridiculous. and as for obfuscating the API -- it's an INTERFACE to the hardware. intellectual property doesn't come into it. you could reverse-engineer NVIDIA's card with an electron microscope more easily than by reading the API to it. they could release a non-OpenGL-certified open source driver AND a binary driver which is OpenGL certified. NVIDIA may be a hardware company, but the software sells the hardware, not the other way around. all anyone needs from NVIDIA is the SPECS, not the stupid chip mask.
  • The NVIDIA drivers are the only drivers which support T&L under Linux, and this will likely remain the case for some time.

    This is not true. T&L support for the radeon is on the way.

    I said likely remain the case for some time, not "it will never happen." If work has just begun on the T&L driver, chances are it will take a while. In all likelihood, there will be a new generation of chips out by the time it's stable. If there is a target date, I would be interested in hearing it - "on the way" can mean anything.

    The DRI Radeon drivers do not support T&L, and ATI is not releasing the necessary info for the developers to integrate it.

    This is also not true. The specs are under NDA but the DRI developers have them.

    Then this is a very recent development. Less than a month ago, VA was "in negotiations" to do a T&L driver, and the "details" could not be discussed. It couldn't even be mentioned whether the driver would support 3D textures.

    -Mark
  • When I talk about 'exploiting the hardware', I'm referring to features, not speed. Many of the G400 features are only available if you compile with the binary-only HALlib from Matrox, and AFAIK, environment-mapped bump mapping (supported by the hardware) isn't implemented.

    The only halfway interesting feature of the V5 is FSAA w/ SLI, but that still isn't supported.

    The only truly interesting piece of modern hardware with DRI drivers in progress is the Radeon, but support for this chip is still severely lacking.

    Don't be so sensitive. People are extremely vocal picking out flaws in the NVIDIA drivers. But criticize the DRI and people get all huffy, even when there are valid complaints about the level of support, timeliness of releases, difficulty of installation, etc. "Yeah, but they're open source" doesn't cut it when you need to get real work done. Yes, the NVIDIA drivers have some problems, but so do the DRI drivers.

    -Mark
  • Obviously you don't have an SMP machine. Nvidia's drivers have serious stability problems in that case, but they'll fix them in the next release, for some value of "next".

    This is totally false. I have two SMP machines at home, and we've got 3 SMP machines at work that are all using the NVIDIA drivers. They are very fast, and very stable. The NVIDIA drivers are the only drivers which support T&L under Linux, and this will likely remain the case for some time. The DRI Radeon drivers do not support T&L, and ATI is not releasing the necessary info for the developers to integrate it. So, buy a Radeon and you're only going to get the features of a Rage128. The NVIDIA drivers fully exploit their hardware - the same cannot be said about any of the open source drivers at the present time.

    People complain about how 'difficult' it is to install the NVIDIA drivers. If people actually read the install instructions, installing them is trivial. Before you complain about the difficulty of installing the NVIDIA drivers, why don't you try installing the DRI drivers from scratch.

    -Mark
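
    For what it's worth, the 'trivial' install being described boiled down to something like the following (the package names from the 0.9-x series are illustrative; the README shipped with each release is the authority):

        # as root -- build/install the kernel module, then the GL libraries
        tar xzf NVIDIA_kernel-0.9-6.tar.gz && cd NVIDIA_kernel-0.9-6
        make                 # builds and installs nvidia.o (see the README)
        cd .. && tar xzf NVIDIA_GLX-0.9-6.tar.gz && cd NVIDIA_GLX-0.9-6
        make                 # installs libGL and the XFree86 "nvidia" driver

    plus the two XF86Config-4 changes (Driver "nvidia" and Load "glx") sketched earlier in the thread.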
  • NVIDIA is going to run for public office!

    It's a good thing, too, because I'd hate to see those sludge-talk skills go to waste. In response to a few dozen direct, eloquent questions, they let slip the following valuable insights:

    • NVIDIA will consider new technologies as they are released.
    • Q: "How will the XBox graphics chip be different from the pc graphics chip?" A: "... it will be very different in design and Capabilities." (very different, you say? Intriguing!)
    • NVIDIA will rise to meet the challenges presented by its competitors.
    • NVIDIA blah blah focused blah blah concern muh muh muh quality uh uh uh performance la la la value

    In my experience, there are two things you can always count on with this company: (1) that their products will be great, and (2) that anything they say is so full of crap that it's not worth the paper it's not printed on, much less the time needed to read it.

    cheers,
    mike

  • There was an article on daemonnews about this and apparently NVidia has done some work to get the Detonator drivers ported to FreeBSD.

    http://daily.daemonnews.org/view_story.php3?story_id=1530

    My advice is to sign the petition [m87-blackhole.org] and hope they listen and finish up the work.
  • Actually, the MX is a piece of shit card. Save yourself some pain and get a Radeon 32 DDR ($87 on pricewatch!)
  • What's missing from your post is any sense of reality.

    1.) By relying on a binary-only driver that must run with root privileges, your system can no longer be trusted. You don't know what that driver contains. You don't know if it contains something that could compromise your entire system's security. You don't know if it contains an obscure bug that could bring down your whole system and might never be fixed because there aren't enough eyes probing the code.
    >>>>>>>>
    Good god, I'm not running an NSA server here, just my desktop machine! Second, I'm sure all those "eyes probing the code" have made GNOME the paragon of stability that it is (tongue in cheek)

    2.) Any company that refuses to open source their hardware drivers clearly does not understand and support the Open Source movement. Such companies, after this much time, are unlikely to change. To use their products is to be forever stuck with a proprietary solution. And what happens when the company phases out driver development for older products? You are now stuck with a binary driver that ONLY works with a specific, outdated Open Source version. Let's say, hypothetically, that tomorrow NVidia stopped developing the GeForce drivers for XF86. Would you be satisfied running XF 4.0.2 for the rest of your video card's useful life?
    >>>>>>>>>>
    That's funny, specifically when said in reference to NVIDIA. NVIDIA is still providing driver updates for the Riva128. The card is three years old. If you're still using hardware that old, you have no right to complain about lack of software support. Also, Win98 runs perfectly well with the 5 to 6 year old Rage II drivers, so if XFree86 5.0 isn't compatible, blame XFree, not NVIDIA. (Not to mention Linus and his driver-API-of-the-day games)

    3.) To use an old saying, "Slow and steady wins the race.." Sure a closed source driver may offer an adequate solution *right now*, but an open source driver will inevitably surpass the closed one in quality in the near future.
    >>>>>>>>>>
    Which is exactly why GNOME totally whips NT4's ass in GUI speed. Not. Face it, OSS isn't nearly the panacea of software development that it's cracked up to be. Properly done, OSS can be a big boost for a software project. It just doesn't do miracles, that's all.

    That is an overview for all hardware drivers. Now what about NVidia vs. ATI/Matrox? Consider that ATI and Matrox cards are generally accepted as having higher quality RAMDACs, which leads to better 2D image quality (a cleaner analog signal). Furthermore, I believe the Radeon DDR bests the GeForce2 GTS in 32-bit at high resolutions by a significant margin.
    >>>>>>>>>>>
    Uh, no. Where do you get your info? While it's true that both the Radeon and Matrox cards do *look* better, the GF2 GTS is more than 10-15% faster than a Radeon 64DDR in most games (even at high res.) For a while there, the Radeon beat the GTS in Quake III at 1600x1200, but after the Detonator 3 drivers, NVIDIA came up big time.
  • Actually, according to the recent Anandtech benchmark, NVIDIA's 2D is about double the speed of Matrox's (better drivers!). While the speed might have improved in the latest alpha-CVS-snapshot-8:00am build, I doubt it catches up. Also, the Radeon is only barely supported (if you can call it even that) in XFree86 4.0.2. It will give you great 2D quality in Windows, though ;)
  • A) My guess is that NVIDIA will support the NV20 on OS X as well. (I think they've publicly committed to it.)
    B) AGP 8x? Please! There was a 3-year gap between the release of AGP 2X (the LX chipset) and AGP 4X (the 820 chipset). AGP 8X is still a few years away!
  • Umm, the 7.xx drivers blow (20% less performance). Avoid them until they stabilize a bit.
  • Last W95 driver is from February 1999 (it is a W95 driver, not a WDM driver!).
    Last feature update for XFree was in 3.3.something, when NVidia switched to the new architecture for their open drivers.
    >>>>>>>>
    Whoops, my mistake. Sorry, the BeOS Unified NVIDIA Drivers support the Riva128 through GeForce2 GTS. I just assumed Detonator 3 did as well. Still, the TNT-1 is an awfully old chip to still see driver updates...

    Riva128 under X still can't calculate timings correctly - my GTF-calculated Modeline for 1280x1024@85Hz doesn't work - it gives 80Hz. With some tweaking I was able to get 81Hz - still far from 85Hz. The Riva128 driver does not support the XVideo extension - although the hardware is capable of it.
    The glx module for 3.3.3/3.3.5 is a joke - I was not even able to run a GL screensaver with it - it crashed the whole of X (that's my whole need for 3D - screensavers and occasionally games
    like chromium).
    >>>>>>>>>
    Aren't any problems in 3.3.x technically the problem of the XFree guys and their drivers?

    BTW., when I bought the card, nowhere on the box did it say "You can use this card only until Feb 1999; later you must buy a new one". The card still works, the chip has functions I need, but I just can't use it.
    >>>>>>>>
    Yea, but you also have to remember that nowhere on the box did it say "Linux supported." You bought a card that dates back to before Linux was on CNN, and you shouldn't be surprised if it is not supported on Linux.

    And finally, if the specs were available, you could still use your favourite OS in years to come (you are a Be fan, right?).
    >>>>>>>>>>
    For BeOS, I'm switching to the Radeon II. I'm pissed at NVIDIA for not giving Be the specs under NDA, but I can understand that they have their reasons. I can get as mad as I want at Linus for not tuning Linux purely for media performance, but he doesn't want to, for his own reasons; I have to respect that and use something else.

    So I can choose a card that looks better, supports the XV extension, and performs decently in 3D (Matrox, ATI), or a card that offers only 2D with open drivers and no extensions (GF2xx). Guess which will NOT be my next card.
    >>>>>>>>>>
    Huh? The Radeon looks richer in 2D. That's fine. However, the Radeon is barely supported in XFree 4.0, and its 3D performance is limp compared to NVIDIA's. I see the options this way. I could be an OSS bigot (and for some things, it makes sense to be one) and use a slower, less fully featured card, or I can use a card with the best 3D acceleration (if performance counts at all, how is Matrox even an option?), double the 2D speed, and nice, stable (for the most part) drivers.
  • I agree, but calling someone "dickheads!" isn't probably the way to get better BeOS support.

    A polite petition would probably work better.

    *shrugs*
  • Has RaMbUs sued you yet?

    Followup: If so, do they want the death of Nvidia or just your firstborn sons?
  • I look at it from this standpoint: Free/OSS software != Free/OSS hardware. Not even by paradigm.

    1.) In the vast majority of the consumer world (meaning Windows, Macintosh, and UNIX), not only are the drivers closed-source, but this is an accepted practice. Drivers are provided free of charge (gratis); drivers aren't a source of income.

    2.) To have a truly 'open' 3D driver of high performance, it is necessary to include the 3D code, or at least the hardware interface to the graphics API (in the case of Linux, Mesa or OpenGL). nVIDIA chose to ship certified OpenGL, not Mesa. I'm not entirely current on Mesa's status; nevertheless, OpenGL is seen as the better option by many buyers, as it is a guaranteed OpenGL-ARB-compliant implementation rather than an uncertified implementation of OpenGL. A certified OpenGL implementation cannot, I believe, be released as 'free' software.

    3.) Having the hardware interface for their card published (as open source) makes it easier to reverse-engineer the hardware. ATI might not care to do so, but just about every other manufacturer would, as even Matrox's 3D is lacking. This is essentially a 'security-through-obscurity' scheme, but in the hardware world, it's all you have.

    4.) nVIDIA is a *HARDWARE* company. Hardware R&D is on a completely different cost curve from software R&D: it's much more difficult to design a chip than to write code. Chip fab is so obscenely expensive that a free-software equivalent of hardware isn't really possible. Unlike software, the machines & tooling to fab hardware can't be duplicated on a whim, and they are extremely expensive. The source materials also cost money (although trivial after machinery & tooling costs). The tooling to make the chip is at least $250k... forget the energy, materials, labor, environmental-regulation, etc. costs accumulated in physical production. So if there are any bugs, it costs millions.

    Too many people don't realize the difference between hardware and software when it comes to intellectual property. The software business gets many rights that hardware makers don't, and it faces far fewer of the penalties.

    Free software takes only a compiler, time, and a coder to generate a useful, reliable, high-quality product.

    You *CAN'T* do that with hardware. But, we'll suppose you have a Free Hardware design.

    Now 'compile' it from the design code to a useful, reliable, high-quality chunk of silicon.

    ... For under $250,000.00 US.

    Free software exists because it's inexpensive enough to be a hobby... a passion. All it takes is time and effort.

    Free hardware can't say that. Only billionaires have the resources to create free hardware.

    So give nVIDIA some credit and a chance to get a return on their investment in the hardware and tooling to create the chip. The only ones really able to take advantage of an open-source driver are the other hardware manufacturers, who would use it to reverse-engineer the hardware, or the users of an OS without a driver. Since Windows, Mac, and Linux cover all but the smallest part of the OS world that uses 3D, and the *BSD clones are used (and advocated) primarily as servers that don't even need X... not as graphics workstations or gaming stations... it shouldn't even be an issue to anyone, except for QNX and BeOS users.
  • ... in the latest linux and windows drivers? It's such a shame to see something so common mercilessly turned into a weapon to hang one's machine. I guess you guys don't want to turn your monitors off.
  • by mauddib~ ( 126018 ) on Monday February 12, 2001 @06:15AM (#438788) Homepage
    First of all: I'm not a Linux user but a FreeBSD user, and one who is very angry at nVidia at the moment. Let me tell you a little story behind the development of our cute little nVidia Riva128 chipset drivers:

    1998: computer bought; the only drivers available are for Windows NT.
    1999: drivers released for Windows 95/98, shortly followed by _very_ buggy Linux drivers.

    By now, nVidia has stopped development for the Riva128 chipset on Linux (which means it "probably doesn't work"). There is no support for any video card under FreeBSD or any other OS besides Linux/NT/98/95. No specs opened, and many developers *willing* to support this chipset, if only they had the specs.

    The result: whenever the XFree86 developers or Linux kernel developers decide to change their implementation, the nVidia drivers are likely to become incompatible.

    And no, I'm not a gamer; I'm just asking for OpenGL hardware acceleration on my system. Tell me why I should stay with nVidia?
  • There are VERY good reasons why you should play the so called "Open Source or Die" game with hardware drivers.

    1.) By relying on a binary-only driver that must run with root privileges, your system can no longer be trusted. You don't know what that driver contains. You don't know if it contains something that could compromise your entire system's security. You don't know if it contains an obscure bug that could bring down your whole system and might never be fixed because there aren't enough eyes probing the code.

    2.) Any company that refuses to open source their hardware drivers clearly does not understand and support the Open Source movement. Such companies, after this much time, are unlikely to change. To use their products is to be forever stuck with a proprietary solution. And what happens when the company phases out driver development for older products? You are now stuck with a binary driver that ONLY works with a specific, outdated Open Source version. Let's say, hypothetically, that tomorrow NVidia stopped developing the GeForce drivers for XF86. Would you be satisfied running XF 4.0.2 for the rest of your video card's useful life?

    3.) To use an old saying, "Slow and steady wins the race.." Sure a closed source driver may offer an adequate solution *right now*, but an open source driver will inevitably surpass the closed one in quality in the near future.

    That is an overview for all hardware drivers. Now what about NVidia vs. ATI/Matrox? Consider that ATI and Matrox cards are generally accepted as having higher quality RAMDACs, which leads to better 2D image quality (a cleaner analog signal). Furthermore, I believe the Radeon DDR bests the GeForce2 GTS in 32-bit at high resolutions by a significant margin.

    Just my $0.02. Please also read Eric Raymond's "The Magic Cauldron." (Especially the last section about open source drivers)
  • need to be addressed before anyone should even bother putting an NV20 in a G4... such as the 4x AGP which Apple has only just now come out with. 6x/8x AGP will be the standard by the release of the NV20. Driver issues are also major. I can't believe nVidia and Apple even BOTHERED making drivers for the GeForce 2 MX for Mac OS 9 when OS X is right around the corner. All Mac software development should be towards AltiVec-optimized (PowerPC "G4" 7400/7410/7450), native Mac OS X code. Wasting time with Mac OS Classic (Mac OS 9 and such) or even Carbon is just that: a waste of time. *sigh* Study up on the OS X IOKit, core gfx, core audio, and Cocoa, and forget the cruft of the past.
  • If it's a GeForce2 MX, that sounds about right; it doesn't have the RAM throughput or the fillrate for much higher than 60 fps at a decent resolution with normal texturing. If you have the horsepower of a 1 GHz system you really oughta match it up with something with a better raster engine: GeForce2 GTS, Pro, or Ultra. Or, if you're doing professional gfx, consider a Wildcat or E&S. Also, any machine with a 1 or 1.5 GHz CPU really oughta be decked out all around to feed that bad boy; otherwise you're better off with a 700-800 MHz CPU. Consider getting a board with dual-channel PC800 Rambus or DDR SDRAM "PC2100". Some good fast drives too; disk is the slowest thing in a system short of the network.
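
    Some rough numbers behind that fillrate claim; every constant below is a ballpark assumption for illustration, not a measured figure:

        /* Why a memory-bandwidth-limited card plateaus well under 100 fps
         * at 1024x768x32.  All constants are rough, assumed values. */
        #include <stdio.h>

        int main(void)
        {
            double mem_bw   = 2.7e9;           /* ~2.7 GB/s of 128-bit SDR */
            double pixels   = 1024.0 * 768.0;
            double overdraw = 3.0;             /* assumed average scene depth */
            double bytes_px = 12.0;            /* 4B color write + 4B Z read + 4B Z write */

            printf("bandwidth-limited ceiling: ~%.0f fps\n",
                   mem_bw / (pixels * overdraw * bytes_px));   /* ~95 fps */
            return 0;
        }

    Texture fetches compete for the same bus, so landing around 60 fps in practice is about what this back-of-the-envelope predicts.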
  • Heh, good point! At least FinalCutPro continues to work. As for "old 1980's technology", I know you're talking about the classic Mac OS, wherein most of the real development took place between 1979 and 1984, but it's kinda funny how Mac OS X (basically OpenStep 5.0) hasn't changed much since the NeXT Cube and NeXTSTEP were released in 1988!
  • >>past about 40-50, your eyes can't even tell the difference anymore.

    True, but only if the framerate is sustained (say, from a movie projector, or high-end graphics gear -- SGI or E&S). Personally I have my gaming rig set up with vsync on, and it rarely dips below my refresh rate (72 Hz), staying locked on solid. Plus, with vsync enabled, I don't get the screen "tearing" that comes with ultra-high framerates and lots of action.
  • I'm waiting for the kernel hackers to make linux work better on the X-Box

    :^)

    Will Microsoft allow game developers to use Linux?

  • Ahhh, so I wasn't the only one who read that article in Nvidia's 'borg voice'.
  • In the article, NVIDIA is asked when they will incorporate 3DFX technology into their cards, and each time they reply with "not sure".

    Well why the hell did they buy 3DFX then? Was it just to take out a competitor?

  • Aside from the fact that it doesn't do AGP on the VIA Apollo chipset, the 0.9-6 drivers are quite okay on my SMP box at work.
    But at home, where I'm in control of what goes inside my computer, I prefer OSS-enabled hardware and use Matrox.
  • What is to become now of the 3dfx opensource effort, given nVidia's anti-opensource leanings? The linux.3dfx.com site is gone, and the placeholder 3dfx.com site explicitly lists nothing for linux drivers.

    What has become of the existing code? I know some of it has been merged into the XFree86 trees, but the rest?

  • unless their crack dealin' daddy bought it fo dem
  • a GeForce2 MX is pretty limited by memory bandwidth, so it's not surprising that a DDR Radeon beats it. I bet your Radeon was a lot more expensive than your 32MB GeForce MX too...
  • by oingoboingo ( 179159 ) on Monday February 12, 2001 @04:03AM (#438801)
    Matrox couldn't really be considered to be on par with ATI and nVidia... their G400/G450 design isn't looking so flash compared to the others... open source drivers can buy you only so many warm fuzzies when you're getting the FPS crap smacked out of you by your 12-year-old next-door neighbour's closed-source-driver 64MB GeForce2 Ultra.
  • I just recently upgraded to a similar configuration: an A7V, 1 Gig Athlon, SB Live!, and an Asus 7700 with a GeForce2 GTS and 64 MB of DDR RAM. It would freeze up on me quite often when running DirectX programs. I tried the latest Detonator driver from Nvidia and quite a few other things. Finally, I got a BIOS update from Asus and that fixed the problems. I think it was rev. 1050D.
  • Wow, I hate to say that, but that was an amazing waste of an interview. They basically admitted nothing (as nakaduct said), except that they like seafood and are not releasing BeOS drivers. The rest of it was a wash. Nothing about where they're going or what they're doing. Useless.

    Hell, I'd at least expect them to say something about when the NV 20 is due.
  • Then you either got a really, really fucking cheap Radeon, or you ridiculously overpaid for your MX. The card from nVidia in the same league as your Radeon is the full GeForce2 GTS, of which the MX is a crippled, cheap version. Comparing performance to your Radeon is completely unfair.
  • But they're *not* in the same league in terms of price. Your case is the exception, not the rule, and frankly I'm still wondering whether or not to believe you got a very expensive card for $50 more than the price of a very cheap card. Go check out some hardware vendors prices on the Radeon and a GeForce2 MX and compare them.
  • you wouldn't expect the "Mako Shark...mmmmm" at the end of the article. Cannibalising one's own namesake, eh?


    47.5% Slashdot Pure(52.5% Corrupt)

  • I have a GeForce2 MX with 32 megs of RAM. I also have an ATI Radeon 64 Meg DDR. Even under linux, the Radeon beats the GeForce, both in terms of performance (with Q3A) and quality, IMHO.

    Now all we need is for the DRI drivers for the Radeon to use the T&L unit on the card.

    Ranessin

  • Not all that more expensive. I think it was maybe $50 more.

    Ranessin

  • nVidia's drivers are fine as long as you don't have a problem. If you do, you're screwed.

    With the DRI drivers, if you have a problem or uncover a bug, just ask the DRI developers and it's usually fixed in a timely fashion.

    Ranessin

  • Where did I say anything about the cause of the performance difference? Hell, I didn't even mention the phrase "open source" anywhere in my post. Yep folks, we've got a real genius here.

    Ranessin

  • The code for the nVidia drivers in XFree86 is *not* obfuscated. At one point it was, but now it isn't. As I recall, the obfuscated code was replaced in 3.3.5.

    Now, if you're talking about the linux drivers they developed in-house: they're not obfuscated, they're closed source.

    Ranessin

  • Frankly, cards should be compared, performance-wise, if their prices are in the same range, shouldn't they? What do I care if the MX and Radeon are two different classes if they cost me approximately the same?

    I got ripped off for getting a great price on two cards? Can I have some of what you're smoking?

    Ranessin

  • As a consumer, I couldn't give a shit if the cards I compared are in the same league in terms of performance. What I care about is whether they're in the same league in terms of price. They were (and yes, I got the Radeon for a good price).

    Ranessin

  • All one has to do is some research and they can find a *very* good price for a Radeon 64 meg DDR.

    Ranessin

  • It's the normal state of things if you know how to shop, moron.

    Ranessin
  • Someone: Will you include technology X in future cards? What do you think of competitor X? What about product X?

    NVIDIA: NVIDIA is the world leader in graphics solutions.

    --
  • Well, at least for the important stuff, anyway. Open source NVIDIA drivers don't exist for a good reason. It's not because they are greedy bastards, and it's not because open drivers would reveal patent infringement (as I once thought...); it's because some technology is licensed from other companies/people/whatever, and they are bound by contract not to reveal it. They just don't have a choice.

    --
  • That's funny, specifically when said in reference to NVIDIA. NVIDIA is still providing driver updates for the Riva128. The card is three years old. If you're still using hardware that old, you have no right to complain about lack of software support. Also, Win98 runs perfectly well with the 5 to 6 year old Rage II drivers, so if XFree86 5.0 isn't compatible, blame XFree, not NVIDIA. (Not to mention Linus and his driver-API-of-the-day games)

    Could you give the URL? As an owner of the original Riva128, I'm really interested in driver update.

    Last W95 driver is from February 1999 (it is a W95 driver, not a WDM driver!).

    Last feature update for XFree was in 3.3.something, when NVidia switched to the new architecture for their open drivers. Riva128 under X still can't calculate timings correctly - my GTF-calculated Modeline for 1280x1024@85Hz doesn't work - it gives 80Hz. With some tweaking I was able to get 81Hz - still far from 85Hz. The Riva128 driver does not support the XVideo extension - although the hardware is capable of it.

    The glx module for 3.3.3/3.3.5 is a joke - I was not even able to run a GL screensaver with it - it crashed the whole of X (that's my whole need for 3D - screensavers and occasionally games like chromium).

    BTW., when I bought the card, nowhere on the box did it say "You can use this card only until Feb 1999; later you must buy a new one". The card still works, the chip has functions I need, but I just can't use it.

    And finally, if the specs were available, you could still use your favourite OS in years to come (you are a Be fan, right?).

    Uh, no. Where do you get your info? While it's true that both the Radeon and Matrox cards do *look* better, the GF2 GTS is more than 10-15% faster than a Radeon 64DDR in most games (even at high res.) For a while there, the Radeon beat the GTS in Quake III at 1600x1200, but after the Detonator 3 drivers, NVIDIA came up big time.

    So I can choose a card that looks better, supports the XV extension, and performs decently in 3D (Matrox, ATI), or a card that offers only 2D with open drivers and no extensions (GF2xx). Guess which will NOT be my next card.
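
    To put numbers on that timing complaint: a GTF-derived modeline for 1280x1024@85Hz looks roughly like the one below (the sync positions are approximate; any GTF calculator will produce something close):

        # GTF-style modeline, approximate values:
        #        name          clock  hdisp hss  hse  htot  vdisp vss  vse  vtot
        Modeline "1280x1024@85" 159.36 1280 1376 1512 1744  1024 1025 1028 1075 -HSync +VSync

    The refresh rate is clock / (htotal x vtotal): 159.36 MHz / (1744 x 1075) = 85 Hz. If the Riva128's clock generator only reaches about 150 MHz for this mode, those same totals scan out at 150 MHz / (1744 x 1075) = 80 Hz, which would explain exactly the behaviour described above.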

  • the BeOS Unified NVIDIA Drivers support the Riva128 through GeForce2 GTS

    Do you realize that the BeOS Unified NV Driver is a port of the XFree driver?

    Still, the TNT-1 is an awfully old chip to still see driver updates...

    When introducing the Riva series, they (NVidia) claimed to have designed a hardware architecture that allowed them to be forward compatible, so you could use new drivers for an old card and vice versa without any problems.

    Also, NVidia does not need to make updates for their older cards - they will certainly not do so forever. However, they should publish the specs for people who are interested in, and able to, self-support their own hardware. See for example GUS support in Linux (GUS = Gravis Ultrasound, a sound card that was produced until 1994. It is a good sound card even by today's requirements, and it is still supported in Linux. This is possible because Gravis published the specifications). And it is certainly older than the TNT :-). On the other hand, see HP and some of their scanners under W2K - totally unsupported within one year of release. Do you see why I avoid binary-only drivers?

    Aren't any problems in 3.3.x technically the problem of the XFree guys and their drivers?

    The nv driver and glx module were written by NVidia staff. Don't forget, XFree guys don't know anything about Rivas. They don't have specs.

    Yea, but you also have to remember that nowhere on the box did it say "Linux supported." You bought a card that dates back to before Linux was on CNN, and you shouldn't be surprised if it is not supported on Linux.

    • The card is not supported even under Windows, although Windows support was on the box.
    • At the same time, NVidia promised support for Linux. And then began obfuscating the sources. Today, we have no source.

    Huh? The Radeon looks richer in 2D. That's fine. However, the Radeon is barely supported in XFree 4.0, and its 3D performance is limp compared to NVIDIA's. I see the options this way. I could be an OSS bigot (and for some things, it makes sense to be one) and use a slower, less fully featured card, or I can use a card with the best 3D acceleration (if performance counts at all, how is Matrox even an option?), double the 2D speed, and nice, stable (for the most part) drivers.

    This is not about OSS bigotry. It is about future support. No company can guarantee future support of their current products ("It is an awfully old whatever. You'd better go buy a new supported toy."). Open drivers can guarantee that. And as long as the product does what it is supposed to do, why buy new?
    As a sidenote: the abovementioned Riva128 is not in my primary machine. My primary machine uses an ATI Mobility M3 and I'm satisfied with its performance. So when the time comes to upgrade that desktop machine (probably this autumn), it will be ATI.

  • It was because it was cheaper to buy the entire company than to pay 3dfx a bunch of money over patent infringement. The fact that they took out a competitor was just icing on the cake.
    I wouldn't hold my breath waiting for an nVidia Voodoo card either, BTW.

    Funk_dat
  • by mojo-raisin ( 223411 ) on Monday February 12, 2001 @03:52AM (#438821)
    We should really be more concerned with the developments of ATI and Matrox. Their 3D drivers are open source and are part of XFree4. NVidia has chosen to ignore DRI and stay closed source.
  • It's typical NVIDIA behavior. I'd love to work for their PR department, because they consider it "work" to answer every question with some variant of "I can't say". I wonder if any of them are ex-NSA employees...
  • Also, go to www.3dchipset.com and download the latest 7.xx drivers. There's another site somewhere that carries the same drivers with benchmarks, but that url is at home. I do know that the ones previous to 7.17 seemed to be more framerate friendly, but the 7.17's have been the best for me under Win2k & ME both.
  • Please check the nvidia site before screaming your ass off; the source is available, as an X server binary plus source, at http://www.nvidia.com/Products.nsf/htmlmedia/detonator3.html - go to the driver library.
  • What is wrong with 60 fps? Movies are 25 fps. 30 fps is smooth gameplay; past about 40-50, your eyes can't even tell the difference anymore.

  • Why is this interview pertinent? Dear god, on the very first question NVIDIA responds with corporate bullshit and vague talk of the topic, as if the NVIDIA interviewee had no clue what it was. Humus then goes on to ask really great questions involving implementation of specific features, etc., and NVIDIA practically shoots him down with a generic bullshit answer like "We are always interested in supporting widespread accepted APIs". Not a good read. Don't read it.

    ___________
    I don't care what it looks like, it WORKS doesn't it!?!
  • GraphicsNerd asks: I have a very specific question about future capabilities of Nvidia hardware that will seriously affect the methodology and outcome of my current project. I need to know about the 2D, 3D, video, and multimedia reference specifications for scene graph management of dependent texture reads and DOT 3 bumpmapping for my PC-based manufacturing, science, e-business, entertainment, and education application.
    NVIDIA: This will be possible in future NVIDIA implementations in a completely generalized fashion, using OpenGL and Direct3D. Although we will not participate in API innovation, we will support any quality API that comes along and is handed to us. Also, we will incorporate any graphics technology containing the strings "rad", "direct", and/or "3D". Although your project will be required to specifically support our future products, we cannot comment on any specific future products. NVIDIA designs products to meet spec, so any hardware incompatibilities are the motherboard manufacturers' fault. Previously, NVIDIA's filter design circuit methodology led to low quality RAMDAC filter use by our partners. In the future, NVIDIA will implement your wildest dreams in a completely generalized fashion, using a sophisticated cross-bar architecture.
  • Well, I have a 1980 GMC 1500 that will kick the shit out of a vette.. Because it has.

  • How about asking Nvidia what happened to my $200 of 3dfx stock? :-)
  • Closed drivers suck for FreeBSD too.. My TNT2 is my first and last purchase from nvidia.
  • ... is that there is no news in the interview. About the only definitive answer was that BeOS is not going to have nVidia as a pal.

    The rest was boring "nVidia is getting better" stuff with discussion about memory management (which is no longer news).

    In reality they are evolving, not making any breakthroughs.

  • Personally, I'm rather offended by the way NVIDIA handled the one and only question regarding support for the Be platform. Being a BeOS user myself, I can attest to the frustration of users who are missing out on the high-end capabilities of a GeForce2 just because of their choice of OS... it's not so terribly bad in itself; after all, there ARE other vid cards to be had that do support Be (my Diamond Monster Fusion still works great), but it's the inherent attitude that I sensed in the article. Starting with heading it up under "Miscellaneous"? I'm sorry, but that's just disrespectful... especially when you look at some of the other categories that were given their own headers.

    Another note: I don't really appreciate the attitude put forth by the question itself, and I believe this may have had a great deal to do with NVIDIA's blunt "NO" response... If the question had been asked with a little more tact and a little less newbie, we might have gotten some useful information as to WHY they were saying NO, but now who can guess?

    There's my little rant; feel free to reply. Flamers, don't waste my time. Thnx Syn
  • I meant no disrespect and/or malice towards the author of the question; my complaint was merely with the way in which it was stated. Hope this clarifies things. Syn
  • Your vsync hasn't been disabled. I get those frame rates with my MX on a 1GHz T-Bird.
  • NVidia provides source to the Linux _kernel interfaces_ needed to hook the driver up to the operating system. The code that drives the NVidia cards' hardware, however, is distributed as a precompiled binary. It's therefore conceivable that somebody might go through the pain and effort of hooking the precompiled binary up to the *BSD operating systems. Since, to my knowledge, no such kernel interface hookups exist for any of the *BSDs right now, I suggest you try another board that is supported on your platform.

    If you go out and buy an NVidia card right now, it'll just sit there and gather dust until somebody finally gets it to run on your platform. By the time you can actually use the card, there will be far more advanced boards available from other manufacturers which are documented and have open source drivers. You, however, will be stuck with an undocumented, overpriced, outmoded and underpowered board.

    __ (c) 2001 Libel & Slander & Associates Inc.
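
    To make the 'kernel interfaces' point concrete: the open part is a thin OS-abstraction layer that the binary core calls into, so a *BSD port means reimplementing that layer against BSD kernel APIs. A hypothetical sketch (these names are illustrative, not NVidia's real interface):

        /* Hypothetical glue layer for a binary driver core (Linux side).
         * The closed core calls OS-neutral hooks like these instead of
         * touching kernel APIs directly; each OS port reimplements them. */
        #include <linux/slab.h>

        void *os_alloc_mem(unsigned long size)
        {
            return kmalloc(size, GFP_KERNEL);  /* a BSD port would use malloc(9) */
        }

        void os_free_mem(void *ptr)
        {
            kfree(ptr);
        }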
  • That's kind of like saying, "I have a Jeep Cherokee and a Corvette. I raced them and the 'Vette surprisingly kicked the shit out of the Jeep. Must be because Chevy's engines are open sourced."
