X GUI

Nvidia Releases Xserver and GLX for GeForce 256

rmmeyer writes "Looks like Nvidia has finally released a GLX driver and XFree86 server for their high-performance video board, the GeForce 256. I've been waiting with bated breath for this to come out since Linux support was announced WAY back before the chipset was released. Found the info on Linux Games."
  • I just ordered a system with a GeForce... without checking Linux compatibilities... good for me then :)
  • I've been waiting for these drivers before I sank the cash into an Athlon, 'cause I love Nvidia graphics boards... This'll be my 4th.
  • by bain ( 1910 ) on Friday January 07, 2000 @07:35AM (#1394147) Homepage Journal
    I grabbed it at first sight! :)))

    Damn... I love my GL screensaver... *grin*...
    It crashes my X once in a while... when trying to run some GL stuff... but it's better than nothing :)

    Can't wait to get Unreal Tournament and Quake III Arena on here.

    Kudos to Nvidia :)

    bain
  • I have been waiting for what seems like forever for this. Now I just hope it works. Well, gotta go, time to buy a new GeForce!!!!!
  • by Anonymous Coward
    Don't get too hyped up; these drivers are still very slow and don't come close to taking full advantage of the card. They are waiting for XFree86 4.0 to write some decent drivers. They should have just written a fully accelerated GLX driver like the ones in Utah-GLX.
  • I am so happy they released a new version to match 3.3.5. I had to stop using the Nvidia 3.3.3 X server when I got my Wacom tablet because their server didn't support the loadable module properly. I think I might just upgrade my TNT2 to a GeForce after all.
  • Will that allow me to play Quake 3? Someone please try; I don't feel like installing it for nothing.
  • I've found the TNT2 Ultra to be a VERY fast card for X11, and I am glad to see the GeForce being supported as well. I'll have to see if the server will also work on the Quadro, their "professional" version.
  • Can someone please post either here or on the Loki Portal Page [linuxgames.com] message board under drivers/gl? I'm eager to get a GeForce!
    Natas of
    -=Pedophagia=-
    http://www.mp3.com/pedophagia
    Also Admin of
  • Well, I've been using Utah GLX (glx.on.openprojects.net, but it's probably something different now :), and I could run the Q3A demo so-so on my TNT2 M64. I only got about 15 FPS...
    I'd try these ones out if I had my TNT2, but it's currently being replaced since it was defective...
  • by Maul ( 83993 )
    If one thing brings X down, it is the lack of support from the chipset makers in allowing there to be open-source drivers.

    Recent moves by 3DFX and others to release open-source drivers will hopefully make XFree86 4.0 much nicer than the 3.x releases in terms of supporting top-of-the-line 3D cards well.

    The efforts of id and others to port games to Linux have also helped chip makers come around and provide drivers.

    "You ever have that feeling where you're not sure if you're dreaming or awake?"

  • I looked at the FAQ and it says that "running Quake III is not recommended" and to wait for XFree86 4.0 with DRI to run Q3A efficiently.
  • Hmmm. Anyone else besides me wish this were fully backwards compatible with the TNT2 and TNT cards?


    Chas - The one, the only.
    THANK GOD!!!
  • As it said, this is not the holy grail DRI X 4.0 implementation yet, obviously... and it still can't take full advantage of the card because of this. I just tried it out, and the GL is slower than Utah GLX with X 3.3.3.1 (nVidia's previous dynamic X server). That was OK because I ditched 3.3.3.1 anyway: licq would crash on a threading problem. So I started this up, but licq crashes even though it's supposedly 3.3.5. I can't work without licq, unfortunately... I'm running the regular 3.3.5 binary from xfree86.org now, which is OK, but I can't seem to get any of the TTF patches working with it (I miss 3.3.3.1 only because nVidia's was patched to use TTF). The new nvidia X binary isn't patched for TTF this time, is it?

    X 4.0 is basically the answer to all our 3D needs... glad to see that they got some more funding recently.
  • by Xemu ( 50595 ) on Friday January 07, 2000 @07:50AM (#1394160) Homepage
    Will that allow me to play Quake 3?

    Read the FAQ before asking -- it specifically says playing Quake 3 on this preliminary driver is not recommended. XFree86 4.0 will implement the DRI infrastructure needed for good Quake 3 performance. Quote:

    Q. How do I run Quake 3 accelerated?
    A. Due to the high demands Quake 3 puts on the client/server architecture of this implementation, running Quake 3 is not recommended. XFree86 4.0 will have a direct rendering architecture needed to use the 3D hardware effectively with Quake 3.
  • Which, for those who have never been in the gallery of a parliament, whether UK [parliament.uk], Canada [parl.gc.ca], or otherwise, is a misspelling of what parliamentarians actually say:
    Hear! Hear!

  • After reading the FAQ, it appears that TNT and TNT2 support has been improved in this version. To wit, my home machine with a TNT2 only had accelerated support in 15- or 16-bit color. Apparently, this version has improved that to 32-bit color.

    We'll see if my Q3 framerate gets above 24, but I'm hopeful. (I'm also glad to see that the licence for the source code is the same as the XFree86 licence!)

    --

  • It appears that the binaries are >1 GB... unless I read that wrong. Is that a little big?
  • by Anonymous Coward on Friday January 07, 2000 @07:59AM (#1394164)
    I was waiting impatiently for nearly SIX months for updated drivers from nVidia for this line of cards, only to be let down once again. The improvements in this set of drivers are 32bpp rendering and texturing improvements. The drivers are still slow and underdeveloped. The blame, so to speak, lies squarely on nVidia's refusal to release any register specs for their cards. They won't release DMA programming information either. John Carmack himself stated that if nVidia released their card specs, the utah glx list could probably hack together quality drivers in a few days. So what do we do? I guess we wait. In the meantime, we still don't have programming information, nor is nVidia likely to release it. What does this mean? I suspect it means that we'll be relying on nVidia for closed-source module add-ons to XFree86 for a while, until they become less paranoid about opening up.
  • Tom (of Tom's Hardware fame) just released a review of this card (DDR & SDR comparison), its specs, and how it stacks up to the competition (G400 and MAXX). Worth a look here:

    http://www6.tomshardware.com/graphic/00q1/000107/index.html [tomshardware.com]
  • It seems NVidia finally understands that releasing their drivers for platforms other than M$ Windows will make them much more popular.
    I waited about 9 months to get any support for the NVidia Riva 128 chipset on any platform other than Windows NT, because NVidia didn't release any specs for it.
    I have nothing bad to say about NVidia's chipsets: they are fast, not very expensive, and technologically up to date.

    After hearing this, I can't say anything else but:
    Good work NVidia!
  • ...the unaffected readers?
  • I have to ask: why another Nvidia card? I've been all kinds of disappointed by the performance of the cards I've owned and seen. The only one I've bought is a Riva TNT from a couple of years back. 9 FPS in Nvidia-optimized 1024x768 OpenGL was not impressive. I recently put a PIII 500 behind it and got up to about 21 FPS with a ton of dropped frames. I added a pair of Voodoo2s in SLI and got 60+ FPS in Glide. If I were gonna buy a GeForce card, I wouldn't bother unless I could get a DDR card; that is the only one putting out respectable benchmarks. My money is on 3Dfx. I'll wait for the Voodoo 5.
  • Me too!

    Now, are you aware that there have been problems between Athlon motherboards and GeForce-based cards? There are fixes, but they reduce overall performance. I'm waiting for those to be fixed, and for the cards using DDR.

    I'm wondering, have any Slashdotters experienced the Athlon problems? With XF86 4.0 coming out soon and the DDR cards going to market this month, will it be worth it to get my Quake 3 fix ASAP on less-than-optimal hardware? I just can't stand waiting...
  • Woohoo! Only another year or year and a half to take advantage of all that money spent! I can't wait. :)
  • They do give the sources to untar over your XFree86 sources. Looking at a couple of the files:

    Copyright 1993-1999 NVIDIA, Corporation. All rights reserved.

    NOTICE TO USER: The source code is copyrighted under U.S. and
    international laws. Users and possessors of this source code are
    hereby granted a nonexclusive, royalty-free copyright license to
    use this code in individual and commercial software.

  • Is there a website out there that is a collection of Linux drivers for all hardware (similar to www.windrivers.com [windrivers.com])? It seems that if something isn't supported out of the box, you have to search through a lot of message boards to find drivers for the device.
    It's great when the company has the driver on their website, but if they don't support Linux, then it gets really tough.
  • NVIDIA put a lot of thought and effort into the object-oriented low-level API they use in their drivers... I seriously doubt a quick hack on top of register-level specs would provide a great performance benefit, and you lose all the benefits of their API (easy arbitration between different drivers accessing the hardware, for instance). I don't doubt some of the developers, even on the utah list, suffer from NIH syndrome, so they would want to do as low-level an implementation as they could. But IMO it's not a good idea. The only problem with the API is that they have not released the source. (Well, it's been run through the preprocessor: still portable, but not readable.)
  • Did you use Megahal to generate this or something?
  • Hello,

    It seems like they released accelerated 3D drivers not only for the GeForce 256, but also for the rest of their product line, including the Riva 128, Riva 128ZX, and TNT/TNT2.

    You can fetch these drivers from this link [nvidia.com].
    --
  • The Riva TNT driver was buggy, unfinished, incomplete, slow, and a pain in the butt to get working. Even if you get it to work, you are lucky to get half the FPS that you get in Windows on the same machine. Does the GeForce GLX driver suffer from this problem too?
  • My system is W95/Linux dual boot, but I have been running W95 exclusively lately. Why? In a word: games. I have several games I enjoy -- Diablo (an oldie but a goodie), Homeworld, System Shock II, and of course Quake III and Unreal Tournament. If I had the means to run these games under Linux, I would reformat my W95 HD, put the CD in the microwave, and never look back. I would dearly love to run Linux 24/7. But facts are facts: I can't play most of these games at all under Linux, and those that do work run slowly. Is there more to life than games? Yes. Do I want to be able to play the games I own at a decent framerate? Yes.


    So, when will, say, a GeForce DDR under Linux be able to give me QIII at 60 FPS in 1024x768 resolution? Will XFree86 4.0 do it? Will new Nvidia drivers do it? Will Wine do it? Or should I just get a Voodoo card and be done with it?
  • by lapdog ( 73128 ) on Friday January 07, 2000 @08:35AM (#1394186) Homepage
    From the FAQ:
    Q. What is new in the 3D acceleration module?
    A. The 3D portion of the driver has been updated to take better advantage of the RIVA TNT/TNT2 products. 3D rendering in 32bpp is now supported, and textures are no longer limited to square powers of 2. Support for NVIDIA GeForce 256 based products has also been added.

    This is much better than the teaser they gave us in June. With the June driver and X 3.3.3.1, I would get around 14 FPS in q3demo1. The GLX module and X 3.3.5 got me around 9. This driver with X 3.3.5 plays nice at 24. It doesn't sound like much, but it's still an indirect-rendering driver. I can settle for that till X 4.0 and DRI come around. I'm just tired of nvidia getting a bad rep from the glx-dev folks.

    And it is much more stable than the older driver (though opening the register specs, à la 3dfx and Matrox, would be nicer). But still, I send my thanks. And you should too. Now.

    Dave
  • I am the proud owner of both a TNT2 card (AGP) and a Voodoo3 2000 (PCI). I use both of the cards, and thanks to the fact that 3dfx has spent time and effort to make their video card work (and work well, I might add) with Linux, that is the main card I use in my system. For my money, NVidia makes a better product (the GeForce rules all), but again does not have proper support for their product. So until I can get great (not mediocre) Linux drivers for my TNT2, I will continue to support the companies (3dfx) that do. And 3dfx makes a pretty decent product to boot.

    ~Jester
  • I agree that the TNT2 Ultra performs very well for X11. I get around 25 FPS in Q3, which is definitely playable. I'll be getting a GeForce now that it's supported.
  • The new ones lock X up solid whenever I run a 3D app. I deleted the old drivers and now they're no longer available on nvidia's FTP... if anyone can find them I'd be grateful... thanks.


    -W.W.
  • You can play Quake 3, but I didn't notice any difference in speed from the drivers they released in the summer. The only major difference I can see on the TNT/TNT2 end is that they finally support 32-bit color depth. On my K6-400, I'm getting about 15 FPS in Quake 3 on the "fastest" setting, which drops to about 8-10 when it counts, in a dogfight with someone else.

    A friend of mine has an Athlon 500 with the same TNT2 as me, and he gets 20-30 FPS on the normal setting. So YMMV.
  • Have you checked out the kernel source lately? Try peeking into /usr/src/linux and you'll find most of the docs there.
  • I'm sorry, but what do nVidia's drivers have to do with buying an Athlon?

    I just don't seem to get it.
  • Yeah, but it says that about the TNT2 drivers as well, and I've found them to work just dandy for Q3. I think they're just hedging their bets.

    If you're not sure, I recommend downloading the Q3 demo and trying it out first. I'm betting it'll work.

    "Moderation is good, in theory."
    -Larry Wall

  • What hardware are you running besides your TNT2? With my TNT2 I haven't noticed _any_ FPS change in Quake 3. If I could play at a 24 FPS average, that would get me through till X 4.0, but as of now I'm still getting the 15 FPS I was getting with the summer drivers.
  • I'm sorry, but you need to pay a bit more attention. nVidia has had glx-enhanced support for the Riva 128 since last June, and generic 2D support has been in XFree86 since at least 3.3.4 (in fact, it should've been in 3.3.3.0, which first had generic TNT1 support).

    This announcement should be nothing new to a Riva 128 owner. It's simply an update to six-month-old drivers.
  • by Malc ( 1751 )
    The explanation comes in the installation instructions with the driver. The driver has not yet been optimised at all. The Glide API drivers used by the Voodoo cards have had a lot more development time spent on them than nVidia has invested. Sure, your framerate with the TNT is low at this time, but that will improve with the next major version of X.

    By way of comparison, I get close to 100 FPS in Q2 under Windows (NT 4 too, not the speedier 9x) on my Celeron 366. When we get better nVidia Linux drivers, the framerate and graphics quality will put the (now very outdated, IMO) Voodoo2 to shame.
  • I built myself an Athlon 500 machine with a GeForce (the high-end version -- Pro?). I had consistent lockups in 3D mode.

    Ended up replacing it with a Voodoo3-3000 and had no more lockups. The Voodoo3-3000, while very nice, is no match for the video quality and speed of the GeForce card.

    I'll probably go back to the GeForce card if these drivers turn out to be stable. The video quality is just... STUNNING!

    Congrats to NVidia for getting on the ball and supporting their users!! Thank you.
  • Your best bet for now is to stay with Windows for games. Oh, wait, I'm sorry. Ignore that, I forgot I need to uphold the perceived superiority of Linux in all things.

    Seriously, though, if you want games, Windows is the only way to go. There is nothing wrong or bad about admitting that Linux can't really handle games well, and may never be able to (is this really such a bad thing? IMHO, no, it's not). So lose the anti-MS attitude and have fun with your games in Win95, because if you take the holier-than-thou attitude and dump Win95, but then complain about not having games on Linux, you will have done nothing but screw yourself.

    BTW, don't waste your cash on a 3dfx card. For reasons I won't go into in depth here (16bpp max 3D rendering, 256x256 texture size, etc.), 3dfx has dropped the ball. Sure, they support their mediocre products fairly well in Linux, but is that what Linux really needs? You'll have to answer that for yourself.
  • Yes, I need to ask why too. The why I want to ask is: why do you want to get an Athlon? Just get a dual-processor PII/III motherboard. With the price of dual-processor motherboards being as low as it currently is, a dual-processor board + 2x PII 450MHz will still be cheaper than an Athlon board + 1x 700MHz Athlon. What's more, you can later upgrade to PIII 800MHz processors when the price is right. So when Intel and AMD are battling it out at 1.2GHz, you will already have a 1.6GHz system, and at probably a quarter the cost!!! You get the speed now, and the upgradability later, with dual-processor configurations. Intel has been trying to thwart people using their processors in MP configurations, but what they don't realize is that this is their greatest strength. If they catch on to this and start marketing it, it will be the end of AMD (unless AMD can come out with some really cheap dual-processor motherboards for the Athlon).
  • Shit! I just got a Voodoo3 3000 for Christmas. Oh well, maybe next year.
  • On my Creative Labs Riva TNT, Quake2 and WolfGL no longer work, and in the case of the latter, it locks up X. Quake3 works, maybe even a little better, but the earlier version worked with Q2 and WolfGL as well. Before anyone flames me: I am an experienced Linux user, and I had no trouble getting the earlier version to compile and work. I tried the binaries, compiling it myself, and several XF86Config changes, including color depth, before I gave up. Maybe I'll try again tomorrow when I have more time.
  • Linux is already a snappy server, and it's becoming a wonderful desktop OS. I use it exclusively, as both desktop and server.

    As soon as Enhanced PnP and XFree86 4.0 are a reality, Linux will make new friends on the desktop. It's inevitable!

    I think XFree86 4.0 will be the biggest leap for the desktop in quite some time. I'm not holding my breath but... I'm so excited I can hardly wait. (but I will).
  • I couldn't play Quake2 under X with the previous release, as it was too dark. Under NT, my Creative TNT drivers allow me to boost the card's gamma setting (which I do before I play Q2). Before I waste my time trying out the new drivers (which apparently have caused some people to lock up), any word on whether this is still a problem?
  • They haven't released their API or their source. There is an API similar to the one they use internally that they post on their website; it has no more features than the GLX driver as far as 3D acceleration/card access goes (no GART, no DMA).

    nvidia's GLX driver uses no low-level API; it uses the card registers directly. The ideal solution would be to release their low-level API, of course, and let it be ported to linux (or at least publicly say they are making it cross-platform for later release). If they had this library released, drivers would come out for new cards literally days after the release of a new API version.
  • by Schwarzy ( 70560 ) on Friday January 07, 2000 @09:32AM (#1394217)
    OK, fine. NVidia released a new version of the X and GLX servers. But... WHERE ARE THE SPECS?

    Sorry, nvidia, but I want to drop my TNT2 for a G400 when I have enough money. I prefer to buy a more expensive video card with (almost) fully open specs, even if on paper the card has less power.

    Have a look at ATI's cards. A few months after they said they would help the open-source community, you can find an ATI project on the Utah-GLX page. I also found a page on TV and overlay support for ATI [the XFree86 team is planning overlay support for Matrox too].

    I wanted to have a look at the TNT specs to see how to use overlay for my TV card, and... I only found ugly software. I'm fed up with these open-closed specs you offer on your site.

    Hey! Friends! Drop your nvidia card for a card from a company that understands the Open community.

  • It should be no faster, whether you use a TNT or a GeForce. It does not utilize features of the card (like DMA transfer or GART texturing), so it will still be very driver-limited.

    The two new features appear to be 32 bpp support and improved texture management.
  • Bzzzt. Glide IS the driver. And they have open specs, and open source.

    take a look at dri.sourceforge.net and weep, AC.

    Pan
  • Strange thing -- I installed 3.9.17 this past weekend... where are these DRI drivers nvidia is talking about? Shouldn't they be in XFree86 4.0? That's what they are waiting for, isn't it? And I thought that 3.9.17 was going to be the last (or next-to-last) snapshot before 4.0.

    I don't think they are written yet.
  • by Svartalf ( 2997 )
    "Hey, if anyone from nvidia is reading, I think it would take all of a single day or two to convert the existing nvidia glx driver to the same pseudo dma / real dma / direct rendering framework we have on the mga driver if some specs were released. It would be nice if we had unified functionality across all three chips, and it would be a major performance boost." http://list s.openprojects.net/pipermail/glx-dev/1999-December /002373.html [openprojects.net] is the link to the archive page for the message he posted. And he's right about that.
  • At least yours runs at all.
    Could you please tell me how you set it up? I can't even get X to work. It's low-res and completely scrambled.
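    For reference, a minimal sketch of the relevant XF86Config fragment. The module name and path here are assumptions based on typical XFree86 3.3.x GLX setups, not taken from NVidia's README, so check the driver's own installation instructions first:

        # Hypothetical XF86Config excerpt (XFree86 3.3.x style).
        # Assumes NVidia's glx.so was installed under /usr/X11R6/lib/modules.
        Section "Module"
            Load "glx.so"
        EndSection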
  • The MGA driver, while it's not at the beta level yet, is quality enough for them to ship the thing with Q3. That's going into a production release of a commercial product for sale. Isn't that quality enough for you? I'd consider it so, myself. It could take us a week or two, max, to get something in place. Shortly I plan on attempting to demonstrate this with the SiS or Trident chipsets (yes, I KNOW the things are lame -- but there are a lot of poor souls out there who are stuck with them because they're embedded on the motherboards of their cheap PCs).
  • Excuse me... MIPS is tired?
    x86 has been around for how many years, based on the same technology that powered the Tandy 1000 SX sitting in my closet back home. And about the GeForce being 10,000 times faster than top-of-the-line cards in SGIs, well, you're just smoking something. And another thing: SGI's power isn't in its cards, it's the entire architecture. They're designed for power, unlike your lowly PC.

    Linux is not the end-all/be-all of computing...sorry to disappoint you.
  • They may have come up with a real gem of an API (for Windows, maybe... I can't imagine an API that magically caters to the requirements of two radically dissimilar OS architectures...), but the problem is that without the specs, we can't fix the API -- we're beholden to them for fixes, which NVidia has not been forthcoming with, claiming they're waiting for the DRI release for the "real thing" from them (c'mon guys, DRI's out already -- just make a DRI driver for 3.9.17!).
  • Ya know, when I bought my V3, I felt a little queasy compromising on OSS principles, but, dammit, I had to have acceleration in Q2 (I'd gotten far too used to it on my work machine to give it back up and end up in software mode on my primary gaming box). Nothing else appeared to be well-supported under Linux at the time, but everything was binary only.

    And they went and open-sourced Glide and everything! Yeesh. Salved my conscience =)

    Let's hope nVidia learns from the example... Maybe with more 3D games coming to Linux (Quake *, UT, Heretic2, Heavy Gear 2, Soldier of Fortune), Loki and others can apply some pressure.

  • by Anonymous Coward
    Once again, a nod of the head to nVidia for releasing open-source drivers. What a shame that in the grand scheme of open source, they aren't worth squat. Why is that? Well, open source prides itself on peer review. And as nVidia *still* has not released specs for public consumption, the drivers cannot be improved.

    And again, despite my saying it many times, I urge people to support companies like Matrox who release specs for driver writing. The Utah-GLX driver has made amazing progress and has even outpaced the Windows 9x drivers on some Matrox boards. And the Matrox driver is more feature-rich, with AGP and DMA support. And the Matrox driver even has a form of direct rendering for XFree86 3.3.x. It's not as good as XFree86 4's DRI, but it's a speed boost.

    Support specs over drivers! Drivers are only as good as nVidia makes them. Specs make better drivers.
  • Right now, I'm running X on an ASUS V6600 GeForce just fine and I never downloaded this server. I think Slashdot forgot to mention that the only thing really new here is the 3D stuff. Oh well, I'm happy. Now my spiffy new GeForce will be able to run 3D stuff in Linux! If only I had some 3D software for Linux...
  • Hey, lose the attitude!
  • I have a TNT, and I'm kicking myself that I didn't get a Voodoo 3 or Matrox instead. The nVidia cards don't offer an acceptable level of acceleration. It's a good start, nothing more.

  • I think you are a bit confused.

    From nvidia's website:

    The GeForce 256 GPU is more complex than today's CPUs providing unprecedented visual power for your PC. With Transform, Lighting, Setup and Rendering on a single chip, the GeForce 256 GPU delivers 15M polygons/second and 480M pixels/second of performance. Its unique 256-bit rendering engine enables an order of magnitude increase in visual complexity.

    From SGI's website:

    RealityMonster(TM) multisubsystem rendering mode provides up to 210 million polygons per second and 7.2 gigapixels per second fill rate, or 1GB of physical texture memory (with 16 pipes), for tackling grand-challenge applications.

    It seems that SGI's chip is 14x as fast.
  • Why wait for hacked beta drivers in Linux? Use Windows for now and take advantage of those exceptional framerates. Once again I ask for someone to post numbers showing that Linux is faster at running a 3D game than Windows.
  • Sorry, but I never managed to get the console version working. I played with it for a while because I prefer playing full screen, but to no avail. One of my co-workers had no luck either. I haven't had time to look at it in almost 6 months now.

    The very best that I could get was 320x200 without any OpenGL :(
  • Is it me... or did I imagine this number: 1646392 KB? Are they smoking something, or did their site developer just think it would be entertaining to slap KB on a file size in bytes? I think the latter, since I DL'ed it in a few seconds... heh, I almost had a heart attack when I saw the size.
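    A quick sanity check of the two readings of that figure (a minimal sketch, assuming the page's 1646392 is really a byte count, which the few-seconds download suggests):

        /* size_check.c -- compare both readings of the "1646392 KB" figure. */
        #include <stdio.h>

        int main(void)
        {
            double figure = 1646392.0;
            /* If the figure is bytes: ~1.57 MB, consistent with a quick download. */
            printf("as bytes: %.2f MB\n", figure / (1024.0 * 1024.0));
            /* If the figure really were KB: ~1.57 GB, the heart-attack reading. */
            printf("as KB:    %.2f GB\n", figure / (1024.0 * 1024.0));
            return 0;
        }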
  • OH -- you want a "Tom's Hardware" comparison... OK. Let's test the GeForce 256 vs. the V3 in OpenGL, 16-bit. The GeForce will probably win, the V3 not too far behind. Do the same in 32-bit color -- oops, the V3 can't do that (long rant why 3Dfx sucks); do the test anyway to show far superiority in a graphics mode no one will use. Let's not bother to do the same test in Glide. Why not? Oh, the Nvidia card can't do that? Better not run the test, might make Nvidia look bad.....

    I'm not saying the Voodoo 2s are the best thing out there. Heck, the Voodoo 3 will beat Voodoo 2s most of the time. I've used quite a few Nvidia cards. I said I've only bought a Riva TNT; I've also used the TNT2, TNT2 Ultra, and Fire GL cards. All of them left me with a "so what" feeling.

    As for your Quake performance, I don't doubt that. Read any hardcore review: the OEMs are tailoring the drivers to Quake. In any other OpenGL app the performance isn't nearly as great. Guess what, I don't play Quake III! Running around in a 10x10 room with rapid-fire weapons and 10 other players is dumb. Id Software can never be Tribes, no matter how hard they try.....

  • The SGI wasn't made just for Quake, you newbie Windows lamer; it's meant for real 3D work. I'm sure the engineers at SGI put getting tons of FPS in Quake2 at the top of their priority list. Do some realtime modeling and calculation with the SGI and watch it wh00p the PC. Ever seen the Permedia2 chipset or the FireGL? They both suck at games but work great in programs like 3dsmax.
  • by Dor ( 93468 )
    It's reported wrong. Just drop the K off the KB. It scared the crap out of me too.
  • Disclaimer: I do not write code, therefore anything that I say may be BS.

    Can you reverse-engineer the code to get the specs? If not, they could get a few coders to sign NDAs for the specs, but release the code as OS. I'd think the OS code would satisfy the zealots, while protecting their proprietary specs.

    "My opinion may be wrong"

    "You ~may~ want to do a backup first" -Rex, 3rd level support.
  • The spelling. "Bated" vs. "baited."
  • Oh, come on. For most people, it's not about bragging rights (though that would be a bonus, of course) or a blind anti-MS attitude; it's about people who genuinely prefer to do their computing in Linux, and like to play games, and don't want to be forced to dual-boot just to play games.

    Personally, if I had the money for 2 high-end boxes (I have a Celeron 400 dual-boot, and a P-166 dedicated Linux box), and the space to set them both up at my desk, I couldn't care less about not having the 'Moral High Ground' of doing everything in Linux. I would just make one box into a game machine, and use the other for work. But dual-booting can be incredibly annoying, and Linux isn't inherently incapable of playing games, so why not try to have it all?

    After all, progress never came from being satisfied with the way things are.

    Chris
  • Well said. The only thing I do under W95 is play games and surf. I'd rather be able to run Linux exclusively, and I can't do that and play the games I have. Bragging rights or anti-Windows bias have nothing to do with it.


    As for my 'anti-MS attitude', I am opposed to Microsoft's corporate behavior and the effects of said behavior on the industry and me as a consumer. I also believe that Linux is vastly superior for things I want to do except for games, which was the whole point of my post. So, I use W95 for games, and look forward to the day when I can stay in Linux.

  • "whine and mumble"? You must have been reading something other than what I posted. So, AC, do you have any links to said hardware petitions?
  • A single Voodoo2 board outperforms even TNT2 cards. To make it worse, Q3 on the highest settings on a single Voodoo2 board _smokes_ low-settings TNT2 cards. Even with these new drivers.
  • by Anonymous Coward
    Why make such a big deal about this?

    The support NVIDIA has provided for the Riva 128, TNT, TNT2, and GeForce is pathetic. The 3D performance is horrible and the drivers still use PIO. They have not released open specs so serious drivers can be written, have not made a serious driver themselves, and the source they have released is impossible to work with.

    While the TNT and GeForce hardware is the fastest, the G400 and 3dfx blow them away on Linux. Why? Serious commitment to supporting Linux, and totally open specs.

    If you're gonna buy, go 3dfx or Matrox.

  • Okay... just an off-topic comment about something I saw after your name. I don't know *why* the fuck people can't get it through their thick skulls: Slashdot source is available on the slashdot site.

    I would post the URL, but if you're that fucking stupid, you might hurt yourself.
  • Slashdot source from about a year ago is available here. It's broken, poorly written, and feature-weak. My point is that the development of slashdot and the attitude of certain site administrators is basically that their code is theirs and you can't see it, or use it, or contribute to it, or help them develop it in any way. Does that sound like open source to you?

    I quote: "Every time someone asks me for the source, I delay the release another day." If, say, Apple declared that OSX was open source, and then consistently responded to questions about its release with this sentence, how quickly would everyone here jump down their throats? But when it's The Great Rob Malda who does it, it's perfectly OK to claim the source is open, when in fact it simply isn't.

    They can do any damn thing they please, but don't claim that slashdot runs on open source software, or that its creators do any more than pay lip service to other people's open source efforts. The simple fact is, slash is a closed product that maybe, someday, if we wait and hope and pray, Lord Malda will see fit to hand down unto us.

    You can disagree with me about whether it should be open, or whether the community has any right to expect the slash crew to practice what they preach, but you cannot argue that slash is currently open source. That's just foolish.

    Oh, yeah, and I'm actually not "that fucking stupid." But thanks for caring.

    "Moderation is good, in theory."
    -Larry Wall

  • The 2D works really well. But the 3D (GLX) crashes on some things (especially the newave GLUT demo). Hopefully they'll post a fix soon.
  • Technically, all that 3Dfx has done AFAIK for Linux 3Dfx support is to port (and then open-source) Glide, and to create the /dev/3Dfx driver. They haven't attempted to integrate their code into other existing environments, like X, which is something that NVidia seems to have done (if I'm reading this information correctly). As a result, anything that wants to use the 3Dfx card has to call Glide, which adds ANOTHER level of indirection. If they had integrated 3Dfx support directly into glx, or Mesa, or whatever, then that would have been nicer. Instead, they supplied us with a proprietary API which you then need to wrap. IMHO this is a Bad Thing. It means that since we can't see what's going on, our understanding of how the card works and of all its hidden options is limited by what we can see from Glide. Besides which, Glide sucks.

    What's more, if memory serves, they didn't write the original Linux version... somebody else did (Daryll Strauss?) and they later took control of it.
  • I have a P2 400 overclocked to 448, 128MB RAM, with a Creative 16MB TNT. I recently got a hankering to play Half-Life again, after an age of not playing it. In 1024x768 I'm easily pulling 30 FPS.

    I can't hope to achieve that performance in Linux, mind you, as the readme that came with the glx drivers states. The drivers are not optimised; they are for development purposes only, and Linux will only reach the same level of 3D graphics performance when XF86 4.0 with DRI comes out.

    Personally, I find the lack of 32-bit colour on 3Dfx cards to be a crime. At most I have found a TNT to be 2 or 3 FPS slower than a Voodoo 2 (which is a 3D card in the same class) in any given game, but it has a vastly improved level of image quality.

    Oh, and on a more on-topic note, this is good news indeed. My birthday is just around the corner, so I should be picking up a 32MB GeForce DDR. I might get an extra 128MB RAM as well, just because prices have come back down and you can *never* have enough RAM.

    Nick
  • There is so much that bothers me in this and succeeding posts I don't even know where to start!

    1. Linux OpenGL support is crap, since on the TNT it is done through GLX, and even on Voodoo it is wrapped through Glide. Glide, however, is pure, unadulterated direct rendering. So don't fault nVidia for faults in the OS.
    2. The GeForce whoops anything 3Dfx has so far. Even a TNT2 Ultra will beat a V3 3500, especially if you turn up the detail. 16-bit color and 256x256 textures just don't cut it.
    3. Sure, dual Voodoo 2s beat a TNT, but not by much. In Windows, only by 30% even in Quake, and even less in other games.
    4. The GeForce is a full 50% faster in Quake 3 than any 3Dfx card, and will only get faster in newer games that use T&L.
    5. The Voodoo 5 6000 will put out 1.3 gigapixels, yes, but at what cost? 4 chips and 128 megs of RAM. And that's only 32 megs effective, since all textures must be stored multiple times, once for each chip. I don't think they will fix that problem, since it seems to be inherent in SLI (the Rage Fury MAXX only has 32 megs effective out of the 64 on the board). If you put 4 GeForces together you get 2 gigapixels per second, and the GeForce Pro (which will be out by the time of the Voodoo 5) will be even faster (probably by about 70%, like the TNT2 was).
    6. In Windows (the only OS that has good 3D, yet) nVidia has a full OpenGL ICD while 3Dfx is loping along with its lame miniGL driver. (I think they might have an alpha-as-hell ICD, though.)

    I could go on, but I think I have made myself clear.
  • Actually, since he is playing Quake, which only gets about a 40% boost with dual processors, a 700 MHz Athlon will be faster. And even more so in non-SMP games (which means all except Quake3).
  • The new TNT server and GLX module were built with some sort of alien C compiler that requires a bunch of symbols that don't exist in any sane C library I've seen. When I try to run their newly supplied XF86_SVGA server:

    X: Error in loading shared libraries: undefined symbol: __rawmemchr

    It gets better. If I use my good old trusty XFree86 3.3.5 SVGA server instead of the one they provide, even their glx.so GLX acceleration module has missing symbols:

    glx.so: /usr/X11R6/lib/modules/glx.so: undefined symbol: __bzero

    What were these people thinking?! They didn't even BOTHER to test their servers and modules before releasing them. I certainly hope I'm not the only person to run into this seemingly important issue!

    For what it's worth, my system is pretty-much-clean Debian 2.1 (slink) with libc6 2.0. I'm shocked that binaries this broken got shipped -- how could they even work on other distributions, requiring internal-looking symbols like that?!

    I didn't have anything like this problem with their old (slow, pokey, buggy) drivers. NVidia, we really appreciate the effort, but please, please, try testing in the future!

    Thankfully, NVidia supplied the source, so hopefully I'll be able to compile it myself and be rid of these nasty symbols. Bonus points for Open Source!
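    (Those __rawmemchr and __bzero symbols look like internal symbols from a newer glibc, which would explain why binaries built against it fail on slink's libc6 2.0 -- that's an inference, not something NVidia's notes state. A minimal sketch for checking what your own libc exports, assuming it is loadable as libc.so.6:)

        /* symcheck.c -- ask the running C library whether it exports the
         * symbols the new NVidia binaries reference.
         * Build: gcc -o symcheck symcheck.c -ldl
         */
        #include <dlfcn.h>
        #include <stdio.h>

        int main(void)
        {
            const char *syms[] = { "__rawmemchr", "__bzero" };
            void *libc = dlopen("libc.so.6", RTLD_LAZY);
            int i;
            if (!libc) {
                fprintf(stderr, "dlopen: %s\n", dlerror());
                return 1;
            }
            for (i = 0; i < 2; i++)
                printf("%-12s %s\n", syms[i],
                       dlsym(libc, syms[i]) ? "present" : "MISSING");
            dlclose(libc);
            return 0;
        }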

  • Do you know WHY ATI released specs for the Rage Pro? Cuz no one buys it anymore! It has been replaced by the current generation! Why did Matrox release the G400 specs? Because Matrox is a distant competitor in the consumer 3D market and wanted to do something to get its mindshare up. (BTW, I love the G400MAX.)

    The cold reality is that nVidia is a market leader and thus has no real incentive to try to ride the Linux hype. (BTW, believe it or not, 3Dfx is nowhere near as popular as nVidia. It may top retail sales charts, but most graphics card buying is done through OEMs by smart individuals and system manufacturers. nVidia has a HUGE lead here; OEMs particularly like to be able to trumpet all the advanced features it has. Plus the Voodoo 3 is basically the same card as the Voodoo 1, so it is in the too-old-to-matter bin.)

    Lastly, maybe nVidia has a good reason not to give out register info. If you hadn't noticed, it gives a good look into the card's internal workings, and why should nVidia give its superior technology out? Just get X4 done and make binary drivers work well. BeOS does, and in its 3rd year on Intel hardware it already has manufacturers making drivers for it. Not only are manufacturer-made drivers higher quality, more drivers get made for different hardware. If you look at Windows, the OS sux, but the drivers are usually much better polished than Linux drivers. Getting access to driver sources rarely does any good, because A) if you are such a manly kernel hacker that you can beat people whose only job is to write drivers for one particular piece of hardware, you'd be working for them, and B) you have to weigh the consequences. For every 1 person who ever hacked his own driver to fix a problem, I bet there are 10,000 who wished they had drivers for their card.
  • Works great with my Creative Annihilator Pro card

    I almost wet myself after installing the Q3 demo... and it just started up

    btw... every day I am more impressed with the Loki installer.

  • Yes, the same applies here; I have a TNT2 card. I wonder why Q2 no longer works, and when I use Blender it locks up X and my consoles. When I try to run Q2 I get: Received signal 11, exiting... Quake 2 used to run beautifully for me, much better than my old Windows dual-boot system with the same TNT2 (that one locked up all the time, and I couldn't even load Windows sometimes; I think that was something to do with the chipset). The shame is that I had Mesa and the GLX compiled against the 3.3.5 source, so that was great, but I hastily deleted it. Quake III is OK, although it still doesn't run quite how I think it should; I've seen it run on a system with almost exactly the same specs as mine on a damn M$ Winblows machine, and it was flawless. So please, if anyone wants to tell me why Quake 2 won't work, I would be most appreciative.
  • They don't want to have to deal with supporting people trying to figure out the specs, so they choose to just not release them.

    Dave does seem like a reasonable guy, but I hate hearing this "we need to protect you" kind of stuff... OK, fine, it's complicated. Let the XFree/GLX guys have a shot at it anyway. Fine, no support. Say it in big, bold letters, and ignore any inquiries. NO SUPPORT. Fine! I bet there are a lot of people who would LOVE to see the specs with no support... :/
    ----

  • Was I the only one to notice that NVidia's site says the binary is almost 1.6 gigs?? Of course it is not; they just put KB instead of B after their figures...
    Ah well.
    Kind regards,
    Kris "dJOEK" Vandecruys
  • Try using a Debian distribution that's not >12 months old.

    These drivers work fine with my potato-based system and a TNT2 Ultra.

    About 10 FPS faster in Q3A than the old ones...
    (still patiently awaiting XFree86 4.0)
  • I've had a lengthy email chat with Dave Schmenk (who worked on this driver, for the most part).

    Dave did a good job. It's not easy to write a driver almost alone.

    Nvidia can't release the work they are doing on the XFree driver, because it uses a licensed version of OpenGL (not Mesa), so their hands are tied. I don't see a problem with this... Mesa is still not completely compliant (it's close enough, but I guess nvidia wanted to use their already-licensed OpenGL).

    Sorry, but NVidia did release their work... so I don't think it's a problem with licensing. Have a look at other projects too.

    The reason (that I got) that specs have not been released is that they are too complicated. Dave says that it takes new nvidia employees approximately three months of on-site training to get up to speed on the current register-level specs. They already have a kernel driver that is about as large and complex as the kernel itself (no wonder).

    I'm almost sure that NVidia is not the only company whose specs take three months of training. But here we have proof that NVidia doesn't understand the open-source community and philosophy.
    There are very, very clever people around the world. With an open-source driver, there are hundreds of eyes looking at the specs and code, more than any company will ever be able to have. These guys are able to quickly build a good-quality driver (I have the guys of the Utah-GLX project in mind). If one of them is in trouble with the specs and/or code, he can find help around the world. There is always someone better than yourself.

    They don't want to have to deal with supporting people trying to figure out the specs, so they choose to just not release them.

    Matrox and ATI also don't want to waste time with people trying to figure out their specs. Open-source driver builders don't need a company's help to do their work.

    Another word also on people LOVING to build drivers. NVidia (like all companies) pays money to have its employees work on drivers. It's quite different from the open-source community, where it's the love of programming that pushes people. And EVERYBODY KNOWS THE POWER OF LOVE :-)

    Message for Dave: Take time to look at the Utah-GLX project and the quality of the work based on open specs.

  • mmmmm....Starsiege Tribes. Must fight temptation. Must fight temptation....

    Must not play too long...
    Must not play too long...

    Just an hour or so...

    Sleep? Who needs sleep?
  • It works fine on my TNT w/ 16MB of memory. In fact, it's worked so well I didn't leave Quake2 for 3 hours.

    Umm, that's a strange admission to make, but it does hold up compared to the previous release!
  • Since June, yes, but I bought this Riva 128 card midway through 1998, so I waited about 10 months for support.
  • HOW IN HELL CAN THIS BE CONSIDERED OFFTOPIC? FUCK IT!!! That's really pissing me off.
  • by Julz ( 9310 )
    Does the GeForce support two video-out ports?
    I know you could run two cards for the same effect, but I was just wondering.
