Graphics Software

NVidia + OpenGL + Linux 119

BJH writes "Saw this on Ars Technica - NVidia have announced their new workstation-class graphics board, and say that it's going to have OpenGL drivers for Linux. Check out their press release for more information. " The hardware looks really, really nice too.
  • by the eric conspiracy ( 20178 ) on Friday November 05, 1999 @04:22AM (#1561526)
    Well, I can tell you some of the things that I have used or seen high-end graphics used for.

    1. Visualization of the 3D structure of molecules. Many chemical reactions require an understanding of the fit between, say, a molecule and a zeolite in 3D. In order to visualize this, I used an Evans & Sutherland graphics workstation with a mechanical shutter and a jittering display image to project a 3D image into the space in front of my eyes. This type of application is big big big in the pharmaceuticals industry. SGI has a very strong market share here.

    2. Visualization of CFD simulations. Real-world work often requires projecting large, multi-dimensional data sets - say, flow fields obtained from computational fluid dynamics - onto a 2D surface. Ideally you would like the ability to view the 3D time-dependent result and rotate or pan the 3D field in real time. Most of the CFD work I have seen is done on HP or Sun workstations these days. It's important in all sorts of places - for example, modelling flows in an oil field or in a tornado.

    3. CAD/CAM. Computer-aided design on a large scale. My brother is a wing designer for Boeing on the Joint Strike Fighter project. Boeing is doing all their airframe design in the digital domain now. This means preparing 3D models showing the actual placement of every component in the airframe and determining its mechanical performance.

    Obviously this is important stuff - it's where the action is in the transference of science to everyday life. I suppose the NVidia card may fit in at the low end of some of these applications.

  • ...the companies that fork out big bucks for this sort of high-end hardware aren't looking to play Q3.

    Well, unless they are id Software :-) which uses Intergraph (among other) hardware for development. The Quadro is positioned to compete with the likes of Intergraph and SGI. Wait a moment, wasn't NVIDIA working with SGI?!? Maybe we will see an SGI Quadro Reality or something...

    -
  • Ok - workstation-class graphics is what you're using when you're doing real-time intensive CAD/CAM and graphics modeling - you can do some of that stuff on your consumer card, but the really-photorealistic-realtime stuff requires a more expensive card. Basically, this is nVidia's first entry into this market. I suggest you pick up a copy of Computer Graphics World to find out more - it's actually quite interesting. Also, see the E&S ad in there - they differentiate quite nicely between 'cards for play' and 'cards for work'. It's my worst nightmare to have to do intensive 3d stuff and be stuck with a consumer card. For games, OK, but if you want the real tomato, the workstation-class stuff is what you want.
  • FYI... Although the press release mentions Linux as a supported OS, I don't see it listed in the preliminary specs on Elsa's page [elsa.com].

    Hopefully this is an oversight or I just missed it.

  • considering that the G400 was out on the shelves about 2 months before the public announcement of the PowerPC G4, I'd say that the situation might be the other way around... but I doubt it.

  • That is an interesting concept. I don't want anybody that is 'not free' for a wife. Did you buy yours?

    JM
  • If I recall, SGI was sponsoring Debian or something.

    If SGI is sponsoring it, they want a kick-ass graphics card for serious workstations.

    In addition, SGI allied [slashdot.org] with Nvidia.
  • Actually, no. That title belonged to S3. Granted, the Diamond Edge cards (which were based on the NV1) sucked, but for their day they were pretty good. Certainly not slower than software. The "3D" scene at that time was the NV1, the Matrox Millennium, and the Virge. 3Dfx had not yet shown its face. (BTW I have lost all respect for 3Dfx (sorry, 3dfx) due to its recent moves. Long live nVidia!)
  • Actually, you should learn before you speak. Winmodems do not have 16550s; no modems do. That is on the motherboard chipset. Winmodems, however, do not have the modem controller on board. (Actually, some of the latest winmodems have the controller on board but use the CPU to do compression, error control, etc.) But true Winmodems do suck. I have one. Should have paid the $20 extra when I was ordering my computer. Dammit, just one little dropdown box away from being able to use my modem under BeOS. But I didn't understand the concept of a software modem. Doh. And on top of that, Bell Atlantic is taking forever to get DSL out. It's been in my neighborhood for a month and I still can't get it. I'll stop bitching now.
  • People will continue to use NT and its derivatives until Linux performs better. 3D people don't care about stability; NT can easily stay up the few days it takes to render most movies (and if it takes more, you are probably offloading rendering to an SGI box anyway). NT and its whole kernel-graphics paradigm just performs better.
  • Not necessarily. The GeForce still has nowhere near the transform power of a REAL high-end card. Plus, high-end modelers need things like anti-aliasing, etc., which the GeForce does not have. As nVidia's something-Kirk put it, "you expect a workstation to have anti-aliasing, it's a given," or something along those lines.
  • by Brent Nordquist ( 11533 ) <`bjnord' `at' `gmail.com'> on Friday November 05, 1999 @02:13AM (#1561548) Homepage
    Chad Miller (founder of the "RIVA Enlightenment Project") has a "linux-nvidia" mailing list; details on this page [chad.org].

    There have been a lot of posts so far on whether nVidia's code is open-source. nVidia created a hardware-enabled GLX driver that integrates with XFree86 3.3.x, and source is available (you can compile it yourself). However, the source is obfuscated to protect what they consider proprietary details about their cards.

    XFree86 4 will be the thing to watch for GLX with integrated 3D hardware support; it looks to me like this is where nVidia is putting a lot of effort. Should be sweet!
    --

  • Yeah, I know what you are saying. My TNT2 may smoke my G200 in speed - but NVIDIA Trilinear Filtering is more like Trilinear Noise Addition(TM)... and the colors from the G200 always seemed more vivid.

    However, note that the Quadro is a separate silicon die from the GeForce. To begin with, I think (but I'm not sure) NVIDIA fixed some quality problems when they made the GeForce. They could (and should) have done even more for the Quadro.

    Not like any of us mere mortals will ever get to see one in action... ;-)

    -
  • Actually the GeForce has a lower clockspeed than the TNT2 Ultra.

    Indeed, and by quite a bit. BUT, it also has four pixel pipelines, where the TNT* has two. And yes, these are used even if the scene is not quad-textured - they can render multiple pixels at once.

    But, as you say, the real big deal with NV10 is the transform engine.


    -
  • why would anyone ever want their CPU time to be spent doing the hardware's work? my guess is that people simply do not realize that this is what winmodems do. maybe someday people will realize this, and they will buy hardware which actually does something, rather than pushing it onto the CPU. it's like having a video card that uses the main CPU to render graphics.



    "The importance of using technology in the right way has never been more clear." [microsoft.com]
  • Very true. SGI owns the source of the reference implementation ('driver'). I can't seem to remember any announcements from SGI saying that they were going to open that source. They get money from licensing that source to hardware makers. But then again, who knows - SGI has been releasing some other cool stuff lately. (Or promising to release...)

    There is always Mesa... and any OpenGL system, be it closed or open, is better than Direct3D. (Noooo, not like anybody on this site would agree with me on that one :-)

    -
  • Don't agree. The glx-driver found on their ftp site performs very nicely with q3test (and my PII 450 :) and has not caused a single problem during the 2 months I've used it. /jarek
  • Even though they conspicuously didn't mention it, the Quadro is a member of the GeForce/TNT family, so existing GeForce/TNT drivers should WORK, just not as well as Quadro drivers...
  • Are they releasing the specs/source as well so that drivers may be written for other platforms? I would *really* like to see BeOS drivers for this. It seems like the kind of hardware BeOS was made to run.

    --
    grappler
  • pixel fairy sez
    > overlay planes are needed by many high end apps,
    > and if these cards dont support overlay they may
    > not get far

    Overlay planes are not really essential if you can grab the image and display it fast enough. I know that it's a waste of memory and processor cycles and bandwidth, but for interactive applications you have those cycles and bandwidth to burn.

    The nice thing about throwing out the overlay-plane hack (perhaps you can detect some bias here) is that you can do much better rendering of the foreground elements that you are interacting with. Overlay planes were typically used to draw things over static backgrounds, and were limited to just a few bits. If you just load the whole background image in every frame, then you can draw nice antialiased, colored, even shadowed lines and objects over the background, and get a much richer interactive experience.

    I've ported a few of my SGI-based visual effects tools to Linux and had to give up on overlay planes, and while it was difficult at first, I don't miss them any more. And this is using extremely slow refreshes; once there is good hardware acceleration for OpenGL glDrawPixels, I will not miss overlay planes at all.

    One thing that these programs do is redraw only the dirty parts of the screen. As you're dragging a rubber-band line across the screen, only a sub-rectangle of the image needs to be refreshed, and this can be substantially faster than refreshing the whole screen. (A rough sketch of this redraw approach follows below.)

    thad
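
    A minimal sketch of the redraw approach described above (the Image struct and redraw_dirty_rect are hypothetical names of mine, not thad's actual code, and it assumes an orthographic projection that maps GL coordinates straight to window pixels):

        // Re-blit the cached background with glDrawPixels, then draw the
        // antialiased foreground straight into the main color buffer;
        // no overlay plane involved.
        #include <GL/gl.h>

        struct Image {
            int width, height;
            const unsigned char* rgba;   // cached background pixels, tightly packed
        };

        // Refresh only the dirty sub-rectangle (x, y, w, h), in window coordinates.
        void redraw_dirty_rect(const Image& bg, int x, int y, int w, int h)
        {
            glEnable(GL_SCISSOR_TEST);   // confine all drawing to the dirty region
            glScissor(x, y, w, h);

            // Blit the stored background instead of relying on an overlay plane.
            glPixelStorei(GL_UNPACK_ROW_LENGTH, bg.width);
            glPixelStorei(GL_UNPACK_SKIP_PIXELS, x);
            glPixelStorei(GL_UNPACK_SKIP_ROWS, y);
            glRasterPos2i(x, y);
            glDrawPixels(w, h, GL_RGBA, GL_UNSIGNED_BYTE, bg.rgba);

            // Restore default unpack state.
            glPixelStorei(GL_UNPACK_ROW_LENGTH, 0);
            glPixelStorei(GL_UNPACK_SKIP_PIXELS, 0);
            glPixelStorei(GL_UNPACK_SKIP_ROWS, 0);

            // Foreground elements (rubber-band lines, handles, ...) can now be
            // drawn antialiased and in full color, which bit-limited overlay
            // planes never allowed.
            glEnable(GL_BLEND);
            glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
            glEnable(GL_LINE_SMOOTH);
            // draw_foreground();        // application-specific

            glDisable(GL_SCISSOR_TEST);
        }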

  • Anybody got a reference to the official word on Maya for Linux? I haven't seen it mentioned in a while...
  • The Nvidia drivers are actually under the XFree86 license, not the GPL.
    --
  • Doh. Even the original TNT has really good RGB overlay support. (Trust me, I know it's the only way to get page flipping in a window under DirectDraw, and on my TNT that works.) I think all cards out now have YUV overlays, because Windows uses them for video playback. I'm certain that the GeForce would also have good overlay capability. A good way to check whether your card has it is to get the DirectX SDK, boot into Windows, and look at the DDCAPS info (a rough sketch of that check follows below).
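    For what it's worth, a rough, hypothetical sketch of that DDCAPS check (Windows only; build against ddraw.lib from the DirectX SDK, error handling mostly omitted):

        // Query the DirectDraw hardware caps and report overlay support.
        #include <windows.h>
        #include <ddraw.h>
        #include <cstdio>

        int main()
        {
            LPDIRECTDRAW dd = NULL;
            if (FAILED(DirectDrawCreate(NULL, &dd, NULL)))   // NULL = active display driver
                return 1;

            DDCAPS hw = {};               // hardware capabilities
            hw.dwSize = sizeof(hw);
            dd->GetCaps(&hw, NULL);       // first arg: hardware caps; HEL caps skipped

            if (hw.dwCaps & DDCAPS_OVERLAY)
                printf("Hardware overlays supported (max visible: %lu)\n",
                       (unsigned long)hw.dwMaxVisibleOverlays);
            else
                printf("No hardware overlay support reported\n");

            dd->Release();
            return 0;
        }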
  • Well, 3dfx may release the V3 specs sometime soon. (I expect to see them around Q1). Real specs (Hello nVidia!!)

    I think Matrox and ATI are currently ahead of the curve as far as being good to us open sourcers.

    I currently have the nVidia TNT, Permedia2, and the Voodoo3. The V3 rocks 'em all in 2D. I'm waiting for documents from 3dfx so I can start coding (hello, DRI anybody?) for XFree86 version 4.0.

    If I can find the documentation, I'm going to look into the Rage and see if it works on the Alpha.

    Pan
  • I know you might think: it's a hardware driver; if it works, why would you need open source drivers?

    But hardware drivers for 3D cards are almost never perfect, and it is common for a company to release several drivers before they are usable for most games. Especially OpenGL drivers, which seem to only be made for the sake of Quake*.

    Think how half-assed the drivers will be for Linux. And do you really think that they will fix bugs in a timely manner for an alternative OS? Hell no.

    For an alternative OS, you need programming specs, and maybe some open source drivers to accompany them. Even without programming specs, you can get a few bugs fixed (for example, the NVidia OpenGL driver from the GLX CVS server does not have that XScreenSaver bug, but it is slower than the released version). However, without specs, something that is merely open source, like the NVidia TNT drivers, can't really be improved. That is why the TNT is slower than the G200 on Linux. That is why open source drivers alone don't matter, but specs do.
  • Well, if you want windowed, then you should probably go with NVidia or Matrox. Note that matrox glx drivers are open source and are being actively developed ...
  • Hey, don't bitch about the DSL... You will eventually get it. It could be worse. If I lived 400 feet down the road, I would have cable modem access for $25 a month instead of the $20 a month for 28.8k (sometimes 31.2k!!! :/ ). I am serviced by a different cable company which outright SUCKS by comparison.
  • Ouch. That does hurt
  • It is great to see that kind of support for Linux from Nvidia. I'm still waiting for my TNT2 to arrive. I hope that there will be good drivers for the TNT2 available for Linux/Mesa soon (with XFree86 4?)

    Greetings,
  • The specs look good, but at what price, and when?
  • Maybe with all these manufacturers releasing drivers, the Winmodem makers will follow suit and release their drivers, and we can finally get Winmodems working under Linux
  • it looks like a great chip, and, as it seems to be using much of the same technology as the GeForce, I am sure it IS a great chip....


    but did you all read the press release? it is going to be placed on some very high-end boards... it is competing in the price class of $1000 video cards....

    even if the chip is not so expensive, the board will be, the older ELSA high-end cards were VERY pricey...

    too bad really, because, just like all the other members of /., that is a card I would love to have...

    We are all in the gutter, but some of us are looking at the stars --Oscar Wilde
  • They didn't say whether the Linux drivers will be GPLed, or even under some Open Source(tm) variant. With all the noise about open source lately, you'd think they could at least TRY to jump on the bandwagon... unless they've already decided not to, which would be a Bad Thing. Maybe they don't know about the /. effect yet...
  • They're not dropping NT support any time soon, I see...

    Hmm... W2K isn't the sure thing it's supposed to be, is it?
    --
  • by BJH ( 11355 )
    Well, since I'm the one that submitted this story, I did in fact read the press release all the way through, and the fact that NVidia would release drivers for a high-end board like this is the point that struck me as being most significant.

    What I'm trying to say is, by producing these drivers, NVidia is indicating that it believes that there are customers out there with enough $$$ to buy the boards, who are also interested in using them under Linux. Think about that for a moment - the companies that fork out big bucks for this sort of high-end hardware aren't looking to play Q3.

  • Overlay planes are needed by many high-end apps, and if these cards don't support overlays they may not get far.
  • Of course not. The NT market hasn't shrunk by a single percentage point; it has simply stopped growing so quickly. It is *still* the biggest market by far.
    And yes, whether we like it or not, W2K is a sure thing: anything with support for it will always sell a hell of a lot better than something which doesn't.
    Don't let Linux's latest victories get to your head, folks - the war hasn't even seriously begun....




    No, I can't spell!
    -"Run to that wall until I tell you to stop"
    (tagadum,tagadum,tagadum .... *CRUNCH*)
    -"stop...."
  • by Anonymous Coward
    who cares if the drivers aren't open source? at least they're supporting the linux platform.
  • by pb ( 1020 )

    Athlon-optimized OpenGL drivers for Linux... How could life possibly get any better? Can you say "Quake III"?

    Oh well, I hope I get a video card that's half as nice as this in a year or so...

    Maybe one day this sort of card will inspire a Tom's Hardware comparison that pits NT against Linux. :)

    ---
    pb Reply rather than vaguely moderate me.
  • A little bit of info at Thresh's FiringSquad [firingsquad.com]

    It has some information about Quadro vs. NV10, and even some CAD-related benchmarks against cards like the GVX1. :)

    Guess who won..

    -Warren
  • too bad really, because, just like all the other members of /., that is a card I would love to have...
    Hmm. According to FiringSquad [firingsquad.com], you wouldn't necessarily gain performance in say, Quake 3.

    See, most 3d games (such as Quake 3) focus on speed and texture detail, thus giving way to low polys and perdy textures. However, CAD generally requires higher-detail model manipulation and whatnot...

    -Warren
  • I don't exactly know when, but I read that the 64 meg version of the Quadro will cost about US$800.00. :)

    -Warren
    one must wonder what is wrong with society when 14 year old boys are listening to Propellerheads mp3s and posting to slashdot at 3:30 in the morning..
  • by EvlG ( 24576 )
    I saw preliminary pricing for the board; the 64MB version is expected to go for ~$800 if I recall correctly.

    I see some people complaining about prices here. While it's not the $250 price point of the GeForce, it's still very reasonable/pretty inexpensive for workstation-class graphics.

    Nvidia has now expressed an amazing commitment to Linux across the board with their products. Let's hope SGI does the same with their 3D software; Maya on Linux would be oh so cool :)
  • See, most 3d games (such as Quake 3) focus on speed and texture detail

    That's only because the previous generation of hardware couldn't handle high polygon counts. Build the hardware, and the software will follow.

  • Mesa would not presently pass OpenGL certification; check out their [mesa3d.org] homepage and click on the CONFORM [mesa3d.org] link there. Mesa is more feature-rich than the MiniGL drivers, but it's still not feature-complete, or even 100% accurate in the features it does support. Mesa != OpenGL; in fact, at present Mesa falls short of OpenGL. This isn't a flame against you or Mesa, but is just a statement of the facts as they presently stand.
  • My friend just ordered his 2 weeks ago and is awaiting its arrival. Will we be able to dl a driver for this bad boy and get it working?
    Natas
  • tell that to the developers of Maya and SoftImage (anyone still use SoftImage?) - it makes a big difference with Artisan.

    all the modern hardware uses 8-bit planes.

    i'm also waiting for good hardware for 2D GL stuff in linux. until then i'll put up with IRIX.
  • From what I've seen on the lists, the TNT cards are generally regarded as faster than the Matrox cards when it comes to 3D.

    But - and this is a big ol' but - Nvidia hasn't played quite as nicely as Matrox when it comes to releasing specs. So, the GLX guys have been able to optimize the heck out of the Matrox driver, and the Nvidia driver hasn't gone as far.

    In fact, John Carmack has more or less stated that he's personally focusing on development for the G400 because the specs are there and he likes to program the *hardware.* This is kind of a bummer, because at this point, the TNT cards are 2 generations old - how many super-secret secrets can be left in them?
  • I am a 3D designer and I've been waiting patiently for a very good 3D card to come out for Linux. I know that some of the 3Dlabs chipsets are currently supported, but full hardware support of OpenGL isn't that great. The introduction of this new chip from nVidia will help boost support for 3D apps on Linux. The addition of Houdini to the list of 3D apps for Linux puts us on the map, and this card will only help.

    This will hopefully convince Avid (makers of SoftImage) and SGI (makers of Alias PowerAnimator and the ever-so-popular Maya) that Linux is a viable platform to do 3D work on. I think if the price is right, i.e. around US$700 or less, then I'm gonna buy it.

    As far as image quality, which one of the previous posters was wondering about, the quality of images displayed may be the same as the Matrox card, but the thing you have to consider is that this new card can open a 3D project in Maya with about 1,000,000 polys or more and you will have no delays when moving around, whereas in Matrox's case it will be chugging along and have you waiting till the cows come home.

    Some of you want to know what makes this board "workstation" class. Well, I think the benchmark numbers speak for themselves. Look at the comparison between the Intergraph Wildcat 4000 and the Oxygen GVX1. The Wildcat has geometry accelerators, which means that the processor doesn't have to transform the geometry; the board can handle everything. If this is included on the card that Elsa is making for the Quadro, then we will definitely be seeing support for more 3D apps in the near future.

    Who could forget about gaming? Well, the board should fully support OpenGL games, such as Quake 3. Most games say they support OpenGL, but that's not full OpenGL; they use a partial OpenGL implementation like the one 3dfx makes. So if you plan on playing games that have full support for OpenGL, then your frame rates should be extremely good.

    I know one thing, I'm gonna get the Quadro as soon as it comes out, and first thing I'm gonna do is fire up q3 and let the fraggin begin =)


  • Well, check out the Customize Homepage link under Preferences - it allows you to set the timezone.

    Of course, you have to have a login first...

  • If you want to play 3D games NOW and don't want to wait for months and months for XFree86 4.0 or promised drivers, use 3dfx (Voodoo3?).
  • Yup! SGI partnered with Nvidia last year, and had engineers work on the GeForce256 and the Quadro. There are major implications downstream for SGI and Linux in the graphics workstation market, using Nvidia-based graphics cards.

    Over the past few months:

    SGI has openly licensed its XFS journaling file system to Linux, paving the way for Linux integration on SGI hardware.

    Along with Red Hat, SGI is funding Precision Insight. Precision Insight hired Brian Paul, the author of the Mesa OpenGL implementation, and is building its multipipe Direct Rendering Infrastructure (DRI) within the upcoming XFree86 4.0 X server; SGI is also providing extensive technical help and other resources to benefit the project. The DRI will include additions and modifications to GLX, Mesa 3.1, and XFree86 4.0, as well as any required modifications to the Linux kernel. Both Red Hat and SGI have agreed to allow full source for the entire project to be donated by Precision Insight to the open source community.

    SGI is shipping servers with Linux, adapted from Red Hat.

    SGI has had a rocky time recently in terms of profits, but the technology is first-rate, and they are leveraging a strong Linux future, probably replacing IRIX, for x86-based workstations. Think about boxes with 1GHz+ Athlon, Coppermine, and Merced processors, and video cards like the Quadro, outperforming graphics workstations that cost 10 times as much. All of this is great for Linux.
  • PS: the first open demo of the GeForce256 was jointly sponsored with SGI, and featured OpenGL API god Dave Shreiner. Stop assuming the worst from Nvidia in terms of Linux support; SGI is taking some serious steps to turn Linux, with Nvidia chips, into a serious graphics workstation OS.
  • It's unfortunate that while 3D hardware is nice and cheap, the nice-and-cheap kind is designed to throw many frames/second of low-polygon models at the screen. The benchmark is Quake III, and that means the most important features on a cheap 3D card are fill rate and texture-map speed.

    Higher-end cards, those designed with more advanced features like geometry setup and anti-aliasing, are much more suitable targets for whatever 3D-like user interfaces eventually arrive. You can count on such interfaces to make use of high-precision models, high polygon counts, and almost no dependence on texture-mapping (or even fill-rate, for that matter).

    The key is _detail_, and that will require very high resolution rendering of anti-aliased models in very large memory spaces. Hopefully, NVidia's entry signals a new era for high-end 3D graphics pipelines, one of increasing affordability.

    MJP
  • I am aware of this. The GeForce aka NV10 is the first video card to support high poly counts while still giving the high framerates.

    -Warren
  • You're obviously somewhat ignorant of the advantages of the GeForce. The GeForce:

    1. Is significantly faster than a TNT2 Ultra
    2. Has much better image quality (John Carmack recommends that you type r_subdivisions 1; r_lodcurveerror 10000; or some such)

    Basically, the GeForce has a higher clock speed, and it also takes a lot of work off the CPU while going through its processes and whatnot.

    -Warren
  • It's GeForce!!!!!! Aaaaaah!

    Ahem.

    The Quadro is basically a GeForce w/ higher clockrate and some hardware acceleration features, as well as having more RAM (there will inevitably be GeForces w/ 64 megs of RAM, .18 micron die, etc.)..

    -Warren
  • winmodems have their name for a reason

    Obviously, if they had reasonable names, they would be called Losemodems...

    --

  • apparently that's the same attitude Sun is taking about its "open source" SW... and now all the Open Source advocates are saying things like Sun "doesn't get it". So there.

    As to the original poster, yes, f33r the /. effect.

    -Warren
  • who cares if the drivers aren't open source? at least they're supporting the linux platform.

    I care. Real Networks support the Linux platform with their binary-only RealPlayer. Adobe do the same with Acrobat Reader. That doesn't let me run either of them on my Sparc Linux box, though...

    Let me spell it out for you one more time: Linux != Intel

  • Under $900; see http://www.elsa.com/AMERICA/PRE_INDX.HTM
  • so how is this "high-end" card different from GeForce? Is GeForce not high-end any more? And what exactly is meant by "workstation-class"?
  • who cares if the drivers aren't open source? at least they're supporting the linux platform.

    Look, I'm not tearing my garments: this is an area in which 'closed source' is relatively harmless, but Free is Better. If you're interested in hardware drivers, having the source code means you can
    a) learn something,
    b) contribute: maybe users can perfect a driver, or
    c) port it to another platform (say, *BSD).
    That, and good PR, is why they should release the drivers free.
    That they support the platform is good, but only as a sign of the recognition Linux is getting as a widely used OS. With open-source drivers, they would be supporting the concept behind it, which would make me way happier.
  • Well, it's a press release on the hardware, not on OpenGL, so it's not too surprising. If they're going to have real OpenGL for Linux, I doubt very much that it will be Open Source. It isn't their property to release as Open Source. There's also the fact that the terms for calling yourself OpenGL require your driver to pass a slew of compatibility tests, so any derivative works would need to pass these in order to remain OpenGL.
  • Non-open-source drivers do nothing to "support the Linux platform". They may be a convenience to people whose systems can run them, but for everyone else who runs Linux they do no good.
  • Unfortunately, I've heard that NVidia hasn't released many specs on their cards other than a single version of a video driver. I think it's been updated slightly, but it still doesn't have any of the fun features, I guess..
    --
  • The GeForce was never a high-end vid card. High-end cards cost over $1,000. The GeForce was a high-end gamer's card, but not a high-end video card.
  • How many pins does the GeForce 256 have?

    The Quadro has 388 pins.

    The Quadro claims 17 million polygons/second.
    The GeForce claims 15 million polygons/second.

    How large is the performance gap between the two?

  • Actually, the GeForce has a lower clock speed than the TNT2 Ultra. It is not even faster in some situations. But, and it is a big but, the GeForce has an on-chip transform engine which offloads your CPU in high-polygon-count scenes.
    That is the big deal with the GeForce.
  • Well, the TNT2 drivers are GPL'ed, and IIRC their implementation of OpenGL (for the TNT2) uses Mesa, so there's no reason to think they'd release these drivers under anything that wasn't GPL. I think they've gotten the picture when it comes to Linux drivers.

    -----------

    "You can't shake the Devil's hand and say you're only kidding."

  • that's correct. That's why the drivers are optimized for athlon too :)
  • It's good to see Linux support for new hardware coming out so much sooner. I remember the early days, before I was Linuxized, reading about having to write your own device support because manufacturers had never dreamed of their stuff being used on anything other than Windows or Macs.

    Still, I wonder if the nVidia coding for OpenGL will be applied to any of their earlier boards. I have an NV1 (Diamond Viper RIVA TNT V770D) board that required a driver download before I could get anything other than 320x240 out of it.

    --

  • I have a TNT2 (Diamond V770 Ultra) and it's fantastic under Linux. I love the 32MB of texture memory, but 64 would sure be nice... I do OpenGL development and it suits all my needs.

    I highly recommend it.
  • I have a G400 in one box and a Leadtek TNT2 Ultra in another. Like most cards these days, they both are very good for 2D work, easily capable of exceeding my monitor's 95KHz horizontal sync limit.

    What's more, there are GLX libraries available for both chipsets -- still binary-only (I think) for nVidia, open source for the G400 -- so you'll have hardware-accelerated Mesa in either case. In my experience, the nVidia GLX library is more stable than the one for the G400. The G400's driver also has some problems with texturing -- the 'superquadrics' mode for xlockmore will demonstrate this. On the other hand, the G400's implementation feels slightly faster than nVidia's. I haven't measured frame rates with the G400, but the TNT2 generally posts scores of 60-80 frames per second for the 'ssystem' OpenGL solar system program if you turn off the on-screen HUD (seems having text in the window slows things down to single digits). I would expect the G400 to post numbers in the same range.

    Next time I have the G400 in the Linux box, I'll measure it. Right now, the G400 is sitting in a Windows box where I can watch an occasional DVD -- the DVD player that ships with the G400 is better than the PowerDVD software that shipped with the TNT2 Ultra.

    One thing I noticed with the GLX drivers, and the one for the G400 in particular, is that if the screenblanker kicks in while displaying 3D, you're in trouble. The machine is still usable if you login remotely, but the local console/keyboard is hosed until you reboot. Moral of the story? "xset s off" :-)

    The kicker, though, is in the price. You can pick up a 16MB OEM G400 (what I have) for around $100. The best price I've seen for a TNT2 Ultra is around $170. For $100, the G400 is hard to beat.

"Our vision is to speed up time, eventually eliminating it." -- Alex Schure

Working...