NVIDIA Cg Compiler Technology to be Open Source
Jim Norton writes "This announcement from nVidia states that
their Cg compiler technology for 3D applications will be Open Source and available under a free, unrestrictive license. The ETA for this is August, and it will be available here." The linked company release says it will be under "a nonrestrictive, free license," but does not give further details. BSD?
If you like it (Score:5, Insightful)
Money talks. If you like what they are doing, tell them you like it by buying one of their cards.
Re:If you like it (Score:3, Insightful)
Until then, I won't buy one of their cards. Period.
Re:If you like it (Score:1, Informative)
Re:If you like it (Score:1, Interesting)
Re:If you like it (Score:1)
Closed source is not evil, no. But in the case of these Nvidia drivers for *BSD, it's not exactly angelic. The problem is that the people at Nvidia just aren't too interested in BSD, so they don't put resources behind the driver. Sadly, they can't just let the code go and be done with it; they have to maintain the thing just like the Windows and Linux drivers.
Anyway, they do have working drivers, which were done by 2-3 people who run FreeBSD there. But other than them, no one else there is interested, so the drivers are not likely to be released (2-3 people can't maintain the thing on their own while staying in sync with the other two drivers). Also, the two initiatives by non-Nvidia people to get a working driver for FreeBSD seem to have died: one officially, the other just seems to have gradually slowed to a halt. :-/
Now if the drivers had been open, *BSD would have working drivers, and all NVidia would have to do is look after the Windows driver. My next card will be a Radeon, since the only reason I run Linux is the lack of NVidia drivers for BSD :-(
By the way, before anyone starts telling me I'm making things up, they really DO have working drivers; I found this out from an NVidia employee on OPN who joined #FreeBSD.
Re:If you like it (Score:1)
Re:If you like it (Score:2)
Remember - more and more special effects companies are moving to Linux. Not only in the rendering farm, but also at the artists' workstations, etc...
The 2D driver won't help the artists much, and NVidia does want those studios to buy the latest and greatest graphics cards from them - especially the Quadro line, which is their top professional card (and costs less than half of the really professional OpenGL-based cards). Without drivers, those studios won't buy it. That's why you see NVidia, Matrox and ATI promising drivers for Linux - and delivering them (especially NVidia, which keeps updating its driver).
Re:If you like it (Score:2)
Now the people doing the modeling... well, they're probably not using Linux anyway. Most likely they've got a nice high-end SGI box in front of them.
This just really isn't very important right now.
Re:If you like it (Score:3, Informative)
Why NVidia will never release driver source (Score:2)
I know someone who once worked for MetroLink. He was part of the team that was writing the NVidia device driver for Metro-X. They were a source licensee, under NDA, yadda yadda, so they had access to the NVidia driver source.
He said that the NVidia driver source is highly coupled with the chip design. Apparently, the NVidia driver people have intimate knowledge of the hardware design, and take advantage of it. This lets the driver exploit as much of the hardware's potential as possible. However, it also means that the driver has specific knowledge of the hardware design.
Given that NVidia's sole business is chip design, you can bet that they will never release source for that driver. It contains too much of their business. (No, it is not a chip schematic, but that isn't the point. It contains enough to make their lawyers unhappy.)
For better or worse, that is the way it is with NVidia. If you do not like it, do not buy their cards.
Re:ATI Would Be Happy (Score:2)
BUT - do you really think that ATI or Matrox cannot reverse engineer the driver? Go ask a Matrox engineer and they'll swear that NVidia reverse engineered Matrox's binary-only driver for dual head, and that's how NVidia got dual head (at least that's what one of Matrox's engineers told me).
Re:ATI Would Be Happy (Score:4, Informative)
SGI has said, on a number of occasions, that they are not at all involved in keeping the nVidia driver closed source. They have also stated that they are in favor of open sourcing the driver.
Dinivin
Re:ATI Would Be Happy (Score:2)
It's a wonder Matrox is still alive in this day and age. Sure, they have what is considered the best 2D card of all time, and that's nice for OEMs and a handful of Windows-using graphic designers, but the margins just aren't there. They lost the 3D race more than 5 years ago. ATI is barely keeping up, and their driver quality is too poor to help them win.
Nvidia just owns the 3D market now, and has for the last 2 years or more.
Re:ATI Would Be Happy (Score:1)
Re:If you like it (Score:1)
Re:If you like it (Score:1)
When a new card comes out, check nvidia's website and see if they have a link to it, if it's another pre-order type of thing...
Re:If you like it (Score:3, Insightful)
Re:If you like it (Score:2)
Of course, ATI is missing a strong competitor to the 4Ti that the original poster referred to.
Bryan
Re:If you like it (Score:2)
Re:If you like it (Score:1)
In any case, saying that ATI blows nVidia's GF4 away with the 9700 series card (when it's not even out yet) is like comparing a Pentium 4 3GHz processor to an AMD Athlon 2200 even though the 3GHz version isn't out yet...
Re:If you like it (Score:2)
You're right, the comparisons are similar. Lots of people have 3GHz P4s, either early samples or overclocks, but the average Joe doesn't.
Bryan
Re:If you like it (Score:2)
me
but then again, what do I know.
NVidia's cards might be the best, if you define "best" as "most FPS in Quake". They're not "best" if you care about things like accurate color, stable drivers (several of my cow-orkers have shiny new laptops with NVidia chipsets/drivers that bring the things down every hour or two), etc. ATI still has them beat there, as do other manufacturers.
And yes, money talks. If people like something but nobody buys it, that something is usually considered a failure. In this case, sending a friendly thank-you note to NVidia along with your order is probably a good course of action...
Re:If you like it (Score:2)
Re:If you like it (Score:2)
Re:If you like it (Score:2)
Re:If you like it (Score:2)
My NVidia cards (TNT and TNT2) used to crash ALL THE TIME under XFree86 (3.3.x and 4.x.x), so I finally thought "Enough of this shit" and bought myself a Radeon, and I haven't looked back since. The drivers are very fast (I used to play Quake 3 and Tribes 2 fine on my Athlon 500; now I have an Athlon 1600XP), and it's nice to watch the DRI drivers mature and get faster and more feature-complete.
Re:If you like it (Score:2)
As for ATI - well, their X driver is very good (written by non-ATI people), but their Windows NT 4 drivers really suck!
Re:If you like it (Score:2)
You have got to be kidding. nVIDIA is known for having rock solid drivers - I've never had a crash while running them, and most other people I know haven't either.
ATI is known for its poor drivers, and has been for a long time.
Re:If you like it (Score:2)
Re:If you like it (Score:2)
I can only pray that they're able to squeeze performance like that out of my GeForce4 Ti4200 in the future. I can already overclock the hell out of it, but it's just not wise.
Re:If you like it (Score:2)
Up until very recently, it would completely destroy your machine.
Re:If you like it (Score:2)
I totally recommend them to other Linux users if you just want high GeForce 2-level performance (they are almost 2 years old now). But they are great for games like Return to Castle Wolfenstein. Very stable too, as they were in Windows 2000.
Re:If you like it (Score:1)
Re:If you like it (Score:1)
Amen to that. (Score:4, Informative)
They have kick-ass products that officially support my platform of choice. 'Nuff said. :-)
Re:Amen to that. (Score:1)
I was born to a rich family in a rich country, so issues about growing hunger aren't my concern.
I'm going to die within the next hundred years, so it's not my concern how badly we pollute our planet.
Re:Amen to that. (Score:1)
I was just trying to make a point. Not saying that source code access is as critical as access to clean food and water.
Many (most) people don't care much about software freedom, and I have nothing against them. They just disagree with me.
But it's strange if you don't care about free software just because _you_ wouldn't use the freedoms given to you.
I support freedom of speech, even if I don't say anything that government or anyone else would want to stop me from saying.
Re:Amen to that. (Score:2, Insightful)
Does that source code include a one MB file called "Module-nvkernel"? Tell me, what language was that file written in?
They're giving the program to me for free as in beer
Presumably you shelled out a fair sum of cash to buy an Nvidia card. That card will not work without drivers. I can assure you, you paid for the drivers when you bought the card.
I'm not a developer, so issues about code modification aren't my concern.
Even though you might never exercise your right to modify code, it should still be a concern for you. You wouldn't be running Linux if it weren't for the ability to modify code. Developer or not, the ability to modify (and audit) code benefits almost everyone (it's debatable whether or not it benefits Nvidia more than keeping the source closed).
What happens when someone restrains a freedom that you want to exercise? Should I support those restraints because they don't affect me? Even if the ability to modify code never benefits you, that doesn't mean you should disregard other people's freedoms.
For the record, if Nvidia were to open source their driver, developers could port it to other operating systems, such as FreeBSD and AtheOS. The X11 side of their driver could be ported to other graphic systems, such as Berlin or the graphics system for AtheOS. The kernel side could be integrated and distributed with the Linux kernel. The X11 side could be integrated and distributed with XFree86. Their code could be used in research projects for new graphics systems. It is possible that Nvidia's GPU can perform operations that could accelerate other computations (perhaps image recognition, speech recognition, or some other project which the drivers were never intended for). Since Nvidia won't open the source, we may never know.
Re:Amen to that. (Score:1)
Since when...? (Score:2)
Re:If you like it (Score:1)
Open-sourcing the Cg stuff is a great way to get lots of people using it, and it gets into the press well, since OSS is all the rage right now.
still, it's a nice thing...
Here's a letter I wrote... (Score:2)
Re:Here's a letter I wrote... (Score:1)
Re:If you like it (Score:1)
Re:If you like it (Score:2)
Money talks. If you like what they are doing, tell them you like it by buying one of their cards.
But first you'd better understand what they're doing and what they're not. They are NOT open-sourcing their video card drivers. Until they do or somebody manages to reverse engineer the binary ones, their products remain proprietary. IMHO, nobody who supports Free Software should buy proprietary hardware that requires closed-source drivers. Instead, this Cg thing seems to be just a language for programming shaders so you don't have to use assembly. Big deal. It's a step in the right direction to have a standard, but it doesn't make their products any more friendly to Free Software.
Could someone explain what's this? (Score:1)
Re:Could someone explain what's this? (Score:2, Informative)
Cg stands for "C for Graphics".
People who write pixel shaders (those nice little algorithms that make games pretty) for things like fog effects, lighting, etc. have to use some low-level assembly (which is sometimes tied to the card as well). This will allow for a higher-level language, so you can use loops, conditionals, etc. when writing shaders, and it allows for possible expanded (cross-card) support.
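For example, instead of card-specific assembly you'd write something like this -- a rough sketch only, with made-up names, not taken from any actual NVIDIA sample:
    // Illustrative Cg-style sketch: a simple per-pixel diffuse lighting term,
    // written in C-like code rather than card-specific shader assembly.
    float4 diffuseMain(float3 normal : TEXCOORD0,
                       uniform float3 lightDir,
                       uniform float4 lightColor) : COLOR
    {
        // Standard N.L diffuse term, clamped at zero.
        float intensity = max(dot(normalize(normal), lightDir), 0.0);
        return lightColor * intensity;
    }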
Re:Could someone explain what's this? (Score:4, Informative)
I'm not sure if it will help, but you can read more about Cg Here [nvidia.com].
Re:Could someone explain what's this? (Score:1)
The many flavors of Geek, large and small. As long as someone believes in their heart that they are a Geek, then they're welcome here.
Geek to live. Live to Geek
OpenGL 2.0 shader (Score:1)
There was some debate on which was better: the 3DLabs proposal, this, or an ATI solution.
Anyone know more?
But whatever happens, thank you.
After all, the chip business needs a reason to sell more chips, and graphics is a big one: the faster people can use the new features, the more games/apps need powerful chips.
regards
john jones
Re:OpenGL 2.0 shader (Score:1)
Maybe this is a move to help stave off the patent problems with MSFT related to OpenGL and provide an alternative system.
Perhaps what I just said doesn't mean anything and I am confusing different things - is Cg an alternative to OpenGL or something else? I do not know enough about graphics to tell.
Re:OpenGL 2.0 shader (Score:1)
license (Score:1, Interesting)
Re:license (Score:1)
If MS helped develop it, no way will it be BSD.
Bill Gates saying how governments should use BSD-style licenses
Well DUH! Can't this be translated as "World's greediest person says 'other people should give me stuff!'"?
Gates thinks that other organizations should release stuff under the BSD license, so that MS can profit from it... you'll note that he's never said Microsoft should use the BSD license.
I'd think that it'll probably be something more like the MPL or Apple's version (i.e., if you release changes, you have to give nVidia a license to bundle and sell them).
Nvidia's Cg (Score:2, Informative)
1. The vertex engine calls are not logical. Sometimes you call passing a referenced pointer, other times you have to pass a referenced structure; some standardization of the calls would have made it easier for developers to write function calls (more insane than POSIX threads).
2. The language is not truly Turing complete, which could have been fixed by taking some more time and making the language more complete.
3. The compiled bytecode is given a security mask that disables its use on chips that do not carry a compliant decoder (to keep competitors away?).
4. Confusing definitions of pointers/references. They could have made this easier by removing pointer usage entirely.
5. Class calls outside of friend functions can at certain times reach memory outside of parent definitions (bad language design?! I think this is one of the most debated features/bugs, since you can piggyback on this to implement vertex calls within lightmaps).
6. No SMP support in the current implementation and no thought given to future support (what about threading?!).
7. Inlining support is bad and possibly unusable outside the scope of inlining Cg within C.
erm... yeah (Score:4, Funny)
I would really love to give some witty comments here -- but am at a loss for words. Which could be fixed by thinking up a few words to form a witty comment with.
Re:Where are the e+'s (Score:1)
Now that was funny. :-)
Re:Nvidia's Cg (Score:1, Troll)
Class calls outside of friend functions can at certain times reach memory outside of parent definitions (bad language design?! I think this is one of the most debated features/bugs, since you can piggyback on this to implement vertex calls within lightmaps).
Agreed. What I feel is that this is bad design; it is not a bug, nor is it a feature. I have been testing on Nvidia's TestA hardware interface with Cg, and that has been the most annoying to date (along with the pointer disaster).
Cg should not have been done the way it was done. I for one would have welcomed them embracing an established language instead of creating a buggy one like this.
There was thought given to using Lisp initially, but I guess the powers that be decided to create a new and totally confusing language instead.
Re:Nvidia's Cg (Score:2)
Well, better a new and totally confusing one than an old and totally confusing one.
Yes, I've coded lisp before. But I recovered.
Re:Nvidia's Cg (Score:1)
isn't that the whole point? (Score:1, Informative)
I thought the whole reason they made a new language (Cg) is because the chipsets weren't Turing complete. If they WERE Turing complete, then it would be a complete waste of time to make a new language -- just make a new back-end for your favourite C compiler and write a bit of run-time.
However, the chips themselves can't do very much -- they can't do a conditional branch, for example. This makes it quite difficult to make a C compiler target them :)
It would be very cool to just be able to do gcc -b nvidia-geforce9 ... or what have you since you'd be able to take advantage of a rich existing toolchain. But, alas, it's not to be.
Re:Nvidia's Cg (Score:5, Informative)
This was done on purpose. Current (and next-generation) GPU shader hardware is not Turing complete, so it'd be quite silly for Cg to be. The problem is that while most extensions to Cg can be added with a vendor-specific profile, extensions which would make the language more general purpose (like pointers, full flow control, etc.) are apparently considered changes to the language design and only NVidia can add them. This isn't a problem now, but it would be if another vendor came out with a more general purpose shader implementation first. (Technically it may be possible to make Cg Turing complete through the extensions NVidia has made available, but probably not in a very friendly way.)
3. The compiled bytecode is given a security mask that disables its use on chips that do not carry a compliant decoder (to keep competitors away?).
Well, supposedly anyone can write a compiler/interpreter for Cg bytecode to target and optimize for their hardware just like NVidia can. (Of course they would need to introduce new functionality to the language through extensions, but the point is any standard Cg bytecode should execute on any DX8+ GPU with a compiler.) Indeed, this is one of (perhaps the only) huge plus to Cg: because it can be interpreted at runtime, rather than just compiled to shader assembly at compile time, new GPUs can (assuming they have an optimizing compiler) optimize existing shader code. This will be nice, for example, in allowing the same shader bytecode to run optimized on DX8 parts (few shader instructions allowed per pass), upcoming DX9 parts (many but not unlimited instructions per pass), and probably future parts with unlimited program length shaders.
Yes, it does require the other vendors to write their own backend for Cg, but NVidia has supposedly released enough for them to do that with no disadvantages. The question is whether they will want to, given that doing so would support a language that NVidia retains control over (as opposed to MS-controlled DX and by-committee OpenGL).
6. No SMP support in the current implementation and no thought given to future support (what about threading?!).
Presumably this can be done via an extension, although it might get ugly to retain backwards compatibility.
7. Inlining support is bad and possibly unusable outside the scope of inlining Cg within C.
What about inlining shader assembly in Cg? And beyond that, what sort of inlining would you want?
Re:Nvidia's Cg (Score:2)
On what basis do you make this claim? Turing (note spelling) completeness can be achieved in very simple languages (for example: Iota [ucsd.edu]), and judging by the Cg language spec [nvidia.com], I can't see any reason to doubt that Cg is.
Was there something specific you were thinking of?
Re:Nvidia's Cg (Score:2)
Sorry but you are wrong. Simply restating your opinion does not constitute a rebuttal.
The Cg language is - as far as I can tell from reading the spec - fully Turing complete. Is it possible you don't know what Turing completeness is?
To be Turing complete, a language needs to support branching and looping. Okay, it's not quite that simple, but this is the essential bit. Cg has both the if and the for () statements.
At best you could argue that some of the profiles currently supported, like dx8vs, don't support conditional branching. However, even in this case the ?: operator is supported, which means that this profile of the language (probably) is Turing complete.
Even if you disagree that the dx8vs profile is Turing complete, the Cg language definitely is, because it does support the if conditional branch.
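For what it's worth, here's a tiny illustrative snippet (the function name is made up, not from the spec itself) showing the for loop and if branch I mean:
    // Sketch only: sums the positive components of a vector,
    // using the for and if constructs the language defines.
    float sumPositive(float4 v)
    {
        float total = 0.0;
        for (int i = 0; i < 4; i++) {   // loop
            if (v[i] > 0.0)             // conditional branch
                total += v[i];
        }
        return total;
    }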
Re:Nvidia's Cg (Score:1, Funny)
There's no point posting on
Now, please get back to work
The guy sitting behind you with a real desk
Re:Nvidia's Cg (Score:1, Funny)
The guy sitting behind you with a real office and a real desk.
Re:Nvidia's Cg (Score:2)
The guy who owns NVidia stock (and owns you!)
Re:Nvidia's Cg (Score:1, Informative)
While there are no pointers explicitly in the language, you can effectively get pointer dereferencing by doing a dependent texture lookup. This is a common technique today with DX8 (e.g. reflective bump mapping) but so far it isn't commonly discussed as a "pointer dereference" or "indirection".
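To illustrate what I mean (the sampler and function names here are made up, not from any shipping shader):
    // Illustrative only: a dependent texture read, where the result of one
    // texture fetch is used as the coordinate ("address") for a second fetch.
    float4 indirectLookup(float2 uv : TEXCOORD0,
                          uniform sampler2D indexMap,
                          uniform sampler2D dataMap) : COLOR
    {
        float2 addr = tex2D(indexMap, uv).xy;   // first fetch yields an "address"
        return tex2D(dataMap, addr);            // second fetch "dereferences" it
    }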
Also, in your comment you seem to be confusing the Cg shading language and the Cg runtime environment API, which are two quite different things.
Eric
Re:Nvidia's Cg (Score:5, Informative)
AFAIK Cg is a C-like language designed to make writing vertex and pixel shaders easier. Real-time shaders for nvidia's and ATI's cards are currently done in assembly. It is not supposed to be a new general-purpose language like C or Python or insert-language-here. All it has to do is transforms on 3D vertex or pixel information.
A vertex shader takes as input position, normal, colour, lighting and other relevant information for a single vertex. It performs calculations with access to some constant and temporary registers, then outputs a single vertex (this is what the chip is built for). It does this for every vertex in the object being shaded. Pixel shaders are a little more complex but similar.
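Roughly, a minimal Cg-style vertex program looks something like this (my own sketch based on the published examples; the names are illustrative):
    // Minimal sketch of a Cg vertex program: transform one vertex into
    // clip space and pass its colour through unchanged.
    struct appin {
        float4 position : POSITION;
        float4 color    : COLOR0;
    };
    struct vertout {
        float4 position : POSITION;
        float4 color    : COLOR0;
    };
    vertout main(appin IN, uniform float4x4 modelViewProj)
    {
        vertout OUT;
        OUT.position = mul(modelViewProj, IN.position);  // clip-space transform
        OUT.color    = IN.color;                         // pass-through colour
        return OUT;
    }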
Points 1-7 have nothing to do with Cg.
There is a very good article on vertex and pixel shaders here [gamedev.net]
If gcc is the GNU c compiler (Score:1)
or gc^2 or gc**2 or pow(gc,2)
Re:If gcc is the GNU c compiler (Score:1)
BGriff
Nvidia's GLIDE (Score:1, Offtopic)
As free as the nVidia graphics card Linux modules? (Score:1, Insightful)
Hopefully that won't be the case with this.
Cooperation (Score:2, Interesting)
Does this really address the matter of concern? (Score:3, Interesting)
In my reading of earlier coverage of Cg, my understanding was that most people weren't concerned about Cg or its compiler being open source, but rather that Cg would depend to some extent on hardware specs that are proprietary. This would have the effect of driving other hardware competitors out of business because they can't implement Cg components because of hardware patents. Sort of similar to fears associated with MS open sourcing part of C# while keeping a good deal of it dependent on proprietary stuff. The fear is that Cg would lead to people saying things like "well, your video card is so crappy it doesn't even support a standard graphics programming language" (all the while being unaware that the card can't because of hardware patents). Just because the language and compiler are open source doesn't mean the hardware they will run on is.
Anyone more knowledgeable care to comment? Am I misunderstanding this?
About the license (Score:1, Informative)
http://www.cgshaders.org/contest/
As you can see from the terms and conditions on that Cg site, they favour and link to the zlib license.
I think that Cg will be under the PHP/zlib license.
Vector C (Score:1)
Besides, there are already C compilers that will turn your normal C code into vector code, for PS2 and 3DNow!/SSE instructions. Check out codeplay [codeplay.com] for more info. Yes, you have to pay for it. They don't have a compiler for the DirectX shading machine yet, but this proves that they could. It's not like we have to invent a new language for every machine.
Re:Vector C (Score:2)
Re:Vector C (Score:1)
"source code", "non restritctive licence" and stuff like that means that oter gc makers don't have to make their own compilers.
Viability of CG (Score:1)
Re:Viability of CG (Score:1)
ATI RenderMonkey (Score:2)
However, there is some pretty good potential there to make a Cg plugin for everything under the sun.
Controlling the shader language standard is almost as important as making a better video card, as you'll have a feature set your competitors have to follow - if Cg becomes the most popular language, then NVidia can say in their marketing material "GeForce 10: 100% Cg compatible; Radeon 50000: only supports 80%".
Re:ATI RenderMonkey (Score:1)
ATI's drivers are very good... when's the last time you gave an ATI product a try? I've been using a Radeon 7500 and Radeon 8500 and I never have any problems with the drivers. Also, what does "trying to push standards" mean? Isn't this what nVidia is doing with Cg?? How is ATI pushing standards?
but that it's from a company with a history of being OpenSource-friendly
And ATI is not open source friendly?? I'm running my Radeon 8500 under Linux with full 3D acceleration... sounds pretty friendly to me.
Seems like just another layer to keep coders out (Score:2, Interesting)
For as long as I remember, the #1 complaint from the open source community has been the lack of open source X drivers, and the lack of documentation for directly accessing the hardware.
This still isn't direct access to the hardware, is it? This is an API that goes through a compiler that translates things into machine code. Absolutely no real access to speak of.
Sometimes I wonder if nvidia cards are truly the hardware marvels that they seem to be. Their implementation sort of reminds me of Play Incorporated's Snappy Video Snapshot, where the hardware functions and BIOS get loaded by an external program. I don't know if this is the exact case with nvidia hardware, but I'm pretty sure I'm not that far off the mark.
If that really is the case, it means that TNT2 cards are capable of all the neat tricks GeForce cards can do, only a lot slower. I can see why you wouldn't want it opened up to the public. What's to stop a competitor from using the same hardware/software implementation you are?
I don't think it would seriously put a dent in the bottom line, however. People tend to stay loyal to a company if it doesn't fuck its customers over. Look at how many hits a day voodoofiles.com gets!
So be bold and daring like the new Doritos. Let other companies mimic your techniques, and try not to worry about the bottom line so much. If you let a bunch of open source gurus hack on your code, you could fire a few of those internal programmers, thereby making up the cost. If you do this, anytime a relative, friend, or customer asks us what 3D card solution they should get, we will respond NVIDIA.
yours truly
--toq
Re:Seems like just another layer to keep coders ou (Score:2)
Yes, you can do pixel and vertex shaders on the CPU, but it will make the application so slow as to be unusable.
Don't think that your 6 year old TNT2 card will become some magic speed demon if nVidia gives you driver source. It won't. Your argument is akin to saying, "Intel, give us the internals to the P4. I know I can make my 80286 run all new code if you do!"
Re:Seems like just another layer to keep coders ou (Score:2)
Did I say that? I thought I said...
If that really is the case, it means that TNT2 cards are capable of all the neat tricks GeForce cards can do, only a lot slower.
Please read comments before replying, thank you.
--toq
Re:Seems like just another layer to keep coders ou (Score:2)
TNT2 doesn't have the transistors to do hardware transform and lighting. It can't do pixel and vertex shaders. Those can only be done in dedicated hardware or on the CPU. No amount of driver source code will change that.
Via proper software drivers (OpenGL and/or DirectX), TNT2 cards can *already* run games that use pixel and vertex shaders. It's just that since the card is offloading all of those calculations to the CPU, the programs are intolerably slow.
Please take your random thoughts to logical conclusions before posting insipid open letters to corporations.
Re:Seems like just another layer to keep coders ou (Score:2)
I won't flame you; you obviously don't know enough about hardware to make any logical conclusions yourself...
Re:Seems like just another layer to keep coders ou (Score:1)
In fact, you cannot even emulate the vertex and pixel shader path in software, because there is no way of inserting it into the correct rendering path on the TNT2.
You cannot emulate rendering 16 textures at once by rendering several times either because there is not enough framebuffer alpha accuracy to do it.
You're living in a land of make believe,
with elves and fairies and little frogs with funny green hats!
Re:Seems like just another layer to keep coders ou (Score:1)
Kinda sorta but not really. An updated driver for a TNT2 board could emulate in software all the silicon a TNT2 is missing. That's true regardless of what card you have. There are software-only OpenGL drivers out there.
Re:Seems like just another layer to keep coders ou (Score:2)
Well, I used to work with the guys from Play Inc. One of them basically explained how the Snappy Video Snapshot worked.
Their custom chip was a sort of combination of ROM/RAM and logic. The ROM acted as a bootstrap for basic parallel port communications. The RAM would store code downloaded via the parallel port, and the logic would chew on that.
Basically, the Snappy never really got any REAL upgrades to the hardware (note this is where nvidia and Play differ; nvidia adds faster hardware). Versions 1, 2 and 3 of the Snappy were all nothing more than "soft upgrades".
I think nvidia cards work in the same fashion; that's why we see such a performance increase between driver releases -- because the actual chip logic is loaded at boot.
1. Preboot, VGA-compatible mode
2. Boot, load custom OS specific hardware register code
3. Load OS specific driver for glue between the OS and the hardware (which is really software)
There is only so much you can do from calls to the OS for speed. If, on the other hand, you could "soft upgrade" the hardware on boot, every time you optimized that boot software a little more, it would stand to reason that the card would run faster.
So basically if you wanted to add that "gwhiz AA x4" feature to your card, you could write it in software, and load it into your card at boot.
Like I said earlier, nvidia open sourcing it would probably lead to a lot of the newer cards' features being found on older cards, only a helluva lot slower. This, too, is a reasonable assumption, because the hardware is slower. It's no less capable of running the same code, though.
Hope that clears things up.
Re:Seems like just another layer to keep coders ou (Score:2)
There is no way you could write a new driver for a TNT2 card that would allow it to do those advanced features. Give up the pipe dream. A programmable pipeline graphics card != a simple video converter box. It doesn't matter how much you believe that the hardware/software design behind a Snappy can be transferred to a video card, it just isn't going to work.
A question I'd like answered... (Score:2)
What I'd really like to know is, as they move forward to new hardware and new drivers and new technologies, will they do so with the free software philosophy in mind, so that they can be more open about their work, and help the community adopt their hardware on other platforms than Windows, Linux, and MacOS.
Certainly, if they release this compiler under a free license, then that's a good first step, because it could mean that they recognize the value of free software and how it aids the spread of technologies to new platforms, not to mention how good free interfaces can become standards. It seems clear that NVIDIA would like to be the new SGI, setting the standard by which graphics innovation is defined.
Re:A question I'd like answered... (Score:1)
cf. Renderman/Maya compilation (Score:2)
Bryan
well of course it's free (Score:1)
Cg compiler is open source, Cg *IS NOT* (Score:2, Insightful)
C for graphics? (Score:1)
Does this mean I can segfault my video card now?
After all, it's not C if my first version of the code that compiles doesn't segfault immediately.
A Shader standard (Score:1)
The standardization on a shading language is going to push renderers to a new level, creating a massive pool of competition for Pixar's PhotoRealistic RenderMan.
This is awesome! (Score:1, Offtopic)
I mentioned how CmdrTaco and friends will never defeat the trolls, because they are some of the most ingenious and inventive Slashdot users. This is a perfect example. Way to go, sllort! Thanks for proving me right!
All the changes to Slashcode will do is hurt the users who do not troll. Stupid Taco.