NVidia Releases Linux Drivers Supporting 4K Stacks
Supermathie writes "NVidia has finally released drivers for their chipsets and the 2.6 kernel that support 4K stacks. That means compatibility with Fedora Core 2 kernels, people! View the README, visit their driver page, or download the package."
Real Story... (Score:5, Insightful)
Re:Real Story... (Score:5, Insightful)
An even better story will be when folks realize that it is OK for the whole world not to agree with them on philosophy. Especially when those philosophies have economic ramifications.
But I ain't holding my breath.
Re:Real Story... (Score:5, Interesting)
I wholeheartedly agree that closed-source code is appropriate for all manner of enterprises (and philosophically, I tend to look at executable code as an open, gloriously inaccessible book anyway). But closed-source device drivers? Just makes me wonder what they're hiding.
Re:Real Story... (Score:5, Interesting)
Several things:
1. There really isn't a way to verify that the drivers actually ship with third-party code; NVidia may be using the issue to squelch requests for open drivers.
2. Goes to show how the license of the code you use in your projects can have a detrimental impact on your future goals (or a beneficial one, depending on those goals, of course).
3. I think it's more likely that the driver sources are kept closed because there are some benchmark tricks, or worse, cheats.
"the only real reason"? (Score:5, Insightful)
Um, no.
0) nVidia might not own all the code they compile into their drivers. The license they have the code under might permit binary distribution, but not source.
1) nVidia's drivers contain large amounts of software that is better than any of their competition. They spent money developing this, and they want to milk the competitive edge it gives them. And that is okay.
2) nVidia has more control this way. The Firefox guys are holding control over their cool icons, because they don't want the cool icons slapped onto broken code; only Mozilla-official builds of Firefox get the cool icons. nVidia might want to be sure that no one runs with broken drivers, then thinks nVidia cards are all junk, when in reality some guy made a few "improvements" that broke things, and distributed the changed version anyway.
3) Other reasons are possible. "the only real reason" my left foot.
Personally, I would much much rather have FOSS drivers. But even more than that, I want drivers that work. I switched from a GeForce 4600 to a Radeon 9600 XT, and even though the Radeon is a much better card, it runs slower under Linux than the older GeForce. It's the drivers. ATI's Linux drivers for the 9600 XT are lame. I actually boot into Windows to play Unreal Tournament 2004, because the performance is so much better under Windows. When I had an nVidia card, my Linux 3D gaming performance was just fine.
If nVidia would make a programmable-shaders card that doesn't double as a space heater, I would probably buy it and replace the Radeon. I know that the Unreal Tournament guys check the server stats, and I want to be "voting" for Linux gaming, so I want them to see me running Linux when they check stats on the servers I have been visiting.
steveha
Re:"the only real reason"? (Score:3, Interesting)
NVIDIA could register a trademark for their official Open Source driver build and disallow the use of the trademark on modified builds. Apache does it that way: modified versions aren't "Apache" anymore.
Re:Real Story... (Score:2)
They, and by the same token wireless network chip makers, are a kind of counterpoint to the rest of the IC industry. Most semiconductor manufacturers freely give away information on how to use their products, even giving away free, non-obfuscated source code!
I really doubt the economic ramification
Closed Source Driver -- Open Source System? (Score:2, Insightful)
NVidia on the other hand is making money purely on hardware, and drivers are a sunk cost. They have to be available or their cards won't sell, and they have to be good or their cards will
Re:Real Story... (Score:4, Interesting)
I'm getting very sick of astroturfers trying to push their marketing drivel (straight out of South Park: "closed source is gooood") at the start of slashdot replies.
By definition, for the customer (us!), open source must provide at least all the options of closed source. All the grandparent did was highlight what is probably the most beneficial potential change for slashdot'ers. If NVidia had released the source as that poster suggested, the 4K problem probably would've been fixed within hours.
---
It's wrong that an intellectual property creator should not be rewarded for their work.
It's equally wrong that an IP creator should be rewarded too many times for the one piece of work, for exactly the same reasons.
Reform IP law and stop the M$/RIAA abuse.
Re:Real Story...NOT INSIGHTFUL (Score:3, Insightful)
Economics is extremely complicated, and I assure you that it is more complicated than just the purchase price for a card at the store.
If you don't think losing trade secrets can change a business model for hardware, ask IBM about the early PCs and clones. They might have a sli
Re:Real Story...NOT INSIGHTFUL (Score:3, Informative)
Yes, the IBM PC XT was a complete POS that couldn't compete with anything else out at the time. Almost nobody used it, apart from a handful of people. That garbage computer didn't even include a decent sound system, for crying out loud!
Then the clones came.
And the XT architecture became popular.
And IBM sold more PCs than they ever thought possible.
Re:Real Story...NOT INSIGHTFUL (Score:2)
I'm curious as to how you came to the conclusion that IBM wouldn't have improved their architecture without pressure from the clones. Sure, it would have been a slower improvement, but they very well could still have wound up with a significant market share.
Re:Real Story...NOT INSIGHTFUL (Score:3, Insightful)
Because there was also Apple. Also planned obsolescence, etc.
Why do you think MS software sucks?
Interesting choice of company; it goes a fair distance toward demonstrating my point. Would Windows be more secure today if competition had forced it to be throughout its life? Sure. Would OSes be better in general? Almost certainly. Would MS have the market share it does today? No way.
Re:Real Story...NOT INSIGHTFUL (Score:5, Insightful)
Re:Real Story...NOT INSIGHTFUL (Score:5, Interesting)
Besides, what would 99.9% of linux people do even if it was open source? Download the source, not even look at it, type make install clean, and be done with it. (Or make setup or whatever the build sequence is; point being that most users wouldn't care.) And for the 0.1% of people who do mess with it, unless they discovered some great tweak that would provide a significant feature or speed advantage over the NVidia drivers, I'd just stick with NVidia's, since I trust them more: the quality of their drivers partially determines their sales, so they have a bigger motivation to get them right.
Re:Real Story...NOT INSIGHTFUL (Score:3, Insightful)
Re:Real Story...NOT INSIGHTFUL (Score:3, Informative)
But:
You really can't revise the nv drivers because they're compiled binaries. Nothing stops you from modifying them except the small detail that there's no accessible code to modify; if there were, this thread wouldn't have started.
The nv driver is not the official release from NVidia. It's a part of XFree86 (and now X.org), and is available under the MIT license
Re:Real Story...NOT INSIGHTFUL (Score:5, Insightful)
If you wanna say "here's our stand, and we stick by it", I respect that. If you say "any stand but ours is unholy and wrong", then you are attempting to control and I have no use fer ya.
I wouldn't violate the GPL; as a programmer I respect other coders' work and time. But I also don't buy into the demand that EVERYTHING be GPL'd, or whatever license you prefer.
The world ain't black and white, kiddies; time to realize that intelligent people have differing opinions most of the time...
Re:Real Story...NOT INSIGHTFUL (Score:3, Insightful)
Companies want to control their product in order to control markets. 'Zealots' want to limit any one company's control of markets to keep them open, and it works. Are you really implying the world and the state of computer technology would both be better had IBM retained monop
Re:Real Story...NOT INSIGHTFUL (Score:2)
OTOH, I would argue that IBM would likely have a better market share and sell more products had they retained monopoly control.
IBM isn't concerned about the world and state of computer technology; IBM is concerned about $REVENUE - $COSTS.
Similarily, NVidia wants to make the most money it can, and it thinks closed
Re:Real Story...NOT INSIGHTFUL (Score:5, Insightful)
But, the fact is that if IBM hadn't "goofed" and created a mostly open system, it's likely that either another, more open system would have succeeded even though it had a lot of obvious faults, or no system would have succeeded and the information age wouldn't be near the point it is. Why? Because a more open system allows programmers, both hobbyist and capitalist, to more easily develop software for the system. The barrier to entry would mean less software overall, which would directly decrease the demand for computers. At the same time, monopolistic control would keep prices high, keeping the quantity sold far below what it is today thanks to the vast number of clones.
So, it's unlikely IBM would have a better market share or sell more products. They might, still, be making more profit due to monopolistic pricing. It does seem unlikely, however, when various other architectures would likely have succeeded in IBM's place and relegated IBM's computers to dinosaur status like the Amiga (no offense to the Amiga intended).
As for NVidia, there are at least two principal reasons why they might wish to keep their drivers closed. The first is that by closing the drivers they have stronger control over rebranding cards at different price points without modifying hardware, which might increase sales without hurting sales of the higher priced cards. The second is that NVidia has cross-licensed a variety of patents, which probably puts them in the position of not having the authority to license the patented ideas under the GPL.
Without number two, number one could be fixed with creative hardware locking mechanisms. The total cost of such hardware locking would be minimal in comparison to the boosted sales of all the likely free porting and driver work done by volunteers on the NVidia driver. The fact is, NVidia is a hardware company so it is in their best interest to commoditize all software for their hardware to be run on. Open sourcing their driver, if possible, would very likely have this effect (it's hard to argue that it could have the reverse effect, at least).
The claim that trade secrets would somehow be revealed by open sourcing their driver is possible, but I would guess unlikely, as the majority of NVidia's actual trade secrets would be in *hardware*. All a driver is supposed to be is a standard interface for the OS, and if there are tasks beyond this in the driver, NVidia would almost certainly benefit from sticking them in hardware as well. It's for this reason I assume NVidia's licensing situation is mainly at fault for them not open sourcing their driver.
Re:Real Story...NOT INSIGHTFUL (Score:2)
I would have expected to see maybe one more company rise up with a personal compu
Re:Real Story...NOT INSIGHTFUL (Score:3, Insightful)
Re:Real Story...NOT INSIGHTFUL (Score:3, Insightful)
"only in the last couple of decades"? what about the great big monopolistic empires of the late 19th Century? Standard Oil? United Steel? J P Morgan and Carnegie? The railroads? These are the reasons the original anti-trust laws were passed. (And before that, go back to the East India Company and th
Re:Real Story...NOT INSIGHTFUL (Score:2)
funny you didn't even mention IBM. shows how their market share really held out when they opened standards. From "IBM compatible" to now just found under "etc."
Re:Real Story...NOT INSIGHTFUL (Score:2)
Who better to control the company's product than the company?
Zealots want to control the companies.
Can you provide any more insight or is this just one of those "salt in the wound" generalizations? I've no idea what this is supposed to mean, unless you expect me to be cynical.
Users want to know that their investment will be safe in the future, that the company they are depending on won't close shop or turn their back on them. C
Re:Real Story...NOT INSIGHTFUL (Score:2)
Re:Real Story...NOT INSIGHTFUL (Score:2)
You sound like you want us to listen to you because economics is "extremely complicated", yet you provide absolutely no insight into how economics works. Are we supposed to just glaze over at this point and forget what the issue was? What's your point?
Re:Real Story...NOT INSIGHTFUL (Score:3, Insightful)
True.
No, companies want to control their revenue sources.
No, zealots want to control freedom of their code and the code that is based on or extends it.
Re:Real Story...NOT INSIGHTFUL (Score:5, Informative)
Try to remember it this time, it's only the 400 millionth time it's been mentioned.
Re:Real Story... (Score:3, Insightful)
It's stupid and there is no "economic ramification"...the drivers are free, after all. They make their money selling cards!
How does it harm Linux? Even NVidia claims more fps under Linux than Windows or competing Operating Systems.
I am just glad that there are quality NVidia drivers available for end users. It doesn't matter much whether they want to keep their trade secrets to themselves or
Re:I agree (Score:4, Funny)
Mycroft
Re:I agree (Score:5, Funny)
Mycroft
Re:I agree (Score:5, Interesting)
But drivers are not one of them. Had they put the closed source code in a user mode library and used just a small open source kernel driver, we wouldn't have all these problems with the driver. It still wouldn't be optimal, but it would be way better than the current situation.
Re:I agree (Score:5, Interesting)
The rest is pretty much trolling, at this level. NVidia has been so far quite open source friendly when it comes to producing drivers. But I guess there will always be people to complain. Me, I'm happy NVidia has drivers for platforms where theirs is the only accelerated choice, like amd64. Others would say the same about IA64, or FreeBSD. Windows and Linux on x86 aren't the only games in town, you know.
Finally, how do you know they don't stand to lose something by making the drivers fully open source? Look at the whole 12 pipelines vs. 16 pipelines thing going on between the latest NV and ATI cards, with last minute info prompting new cards on both sides. If NVidia releases drivers for their last generation of cards that take the competition a couple of months to disassemble and analyze, they might keep the edge long enough to move on.
Re:I agree (Score:4, Informative)
Huh? There certainly is a difference between what I described and how the driver currently works. Maybe my suggestion would require changes to the binary code and/or the X server, but it certainly should be possible.
The interface with the kernel is open source
That statement doesn't make any sense. You can say the interface is open, and you can say the kernel is open source. But an interface is something more abstract than a piece of code.

When talking about interfaces to the kernel, it is important to keep in mind that there are two different interfaces. There is the user/kernel mode interface. This API complies (mostly) with various standards: POSIX, BSD, the Single Unix Specification, SysV. But the standards only specify the API, not the ABI, which is Linux specific. This ABI is kept as stable as possible even across kernel versions. But this interface is not really important when discussing kernel modules.

The functions kernel modules can link against may change, and no attempt is made to keep that ABI stable; only the API is kept stable within each major version, and only as long as it doesn't turn out to be a major problem. This API is, however, the same across multiple CPU architectures (unlike the user/kernel ABI discussed before). What this really means is that if you want to ship a Linux kernel module, you have to ship it as source, because it is only at the source level that there is a well-known interface. A fixed ABI is just not possible: the differences between CPU architectures alone are enough to make it impossible, and in addition some data types in the kernel change depending on configuration options. And finally there are things like the 4K stacks, where the current macro had to be changed.
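To make "ship it as source" concrete, here's a minimal sketch (a hypothetical hello-world module, obviously not nVidia's code): even something this trivial has to be compiled against the headers of the exact kernel it will load into, because only the source-level API is stable.

    /* hello.c - sketch of the smallest possible 2.6 kernel module;
     * it must be rebuilt for every kernel it is loaded into */
    #include <linux/module.h>
    #include <linux/kernel.h>
    #include <linux/init.h>

    static int __init hello_init(void)
    {
            printk(KERN_INFO "hello: built against one specific kernel\n");
            return 0;
    }

    static void __exit hello_exit(void)
    {
            printk(KERN_INFO "hello: unloaded\n");
    }

    module_init(hello_init);
    module_exit(hello_exit);
    MODULE_LICENSE("GPL");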
the closed source code is a binary object that gets linked into the module.
And that is a problem. Not only does it only work on one architecture, but it makes assumptions about the kernel, which may not be satisfied. The amount of stack space is not the only problem here. The code can break the kernel in various ways, which means you can no longer trust your kernel.
context switching to a different privilege level would only hurt performance.
Some years ago I did some measurements of this on a computer that is now five years old. It could do one million switches from user mode to kernel mode and back again per second. I believe newer machines can do a bit more than that. I guess very few people have a monitor refresh rate of more than 100Hz. That means you have time for about 10,000 switches per frame. Of course you can't spend all your CPU time just switching, but say you can render a single frame with fewer than 1,000 switches; then you certainly wouldn't have a performance problem. And if more than 1,000 switches are required to do a single frame, then you have a broken design that needs to be fixed. It is not the amount of data needing to be transferred that is a problem, because you could either map board memory directly into the user mode process, or (a little more complicated) do DMA directly to user space. So I won't believe you're talking about real performance problems until I see proof that they can't be avoided.
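For anyone who wants to repeat the measurement, a rough sketch of it (timing getppid(), one of the cheapest real system calls; compile with -lrt on older glibc, and expect the numbers to vary by machine):

    /* sketch: estimate user<->kernel round trips per second */
    #include <stdio.h>
    #include <time.h>
    #include <unistd.h>

    int main(void)
    {
            const long n = 1000000;
            struct timespec a, b;
            clock_gettime(CLOCK_MONOTONIC, &a);
            for (long i = 0; i < n; i++)
                    getppid();      /* one switch into the kernel and back */
            clock_gettime(CLOCK_MONOTONIC, &b);
            double secs = (b.tv_sec - a.tv_sec) + (b.tv_nsec - a.tv_nsec) / 1e9;
            printf("%.0f round trips per second\n", n / secs);
            return 0;
    }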
NVidia has been so far quite open source friendly when it comes to producing drivers.
NVidia have not really been that friendly. They may seem friendly when compared to other 3Dgfx manufacturers. This really just means there is a market where no vendor gives a damn about its customers. I hope some vendor will realize this, because if they do, and make the product the customers want, then I believe they can make some money.
Re:I agree (Score:5, Informative)
That works great if you can guarantee separation. Otherwise debugging is a nightmare, knowing that there are some black boxes in your system which can manipulate the whole system.
Sorry, user mode doesn't really make much sense here, drivers need full hw access and context switching to a different privilege level would only hurt performance.
Right, that wouldn't work too well - but if everything runs in kernel mode then there is no border control between the driver and the rest of the kernel. The driver has to be trusted to play nice and not to fuck up the kernel data structures, because there's nothing that can stop it from doing that. It would be different if the driver ran in user mode, because then the driver would throw segmentation faults and the like if it did something illegal.
The conclusion is that source code should be available for everything that runs in kernel mode.
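A two-line illustration of that border control (hypothetical, of course): the same wild pointer that merely kills a user process can silently corrupt anything when executed in kernel mode.

    /* sketch: in user mode the MMU turns this into a SIGSEGV that
     * kills only this process; inside a kernel module, the same bug
     * can scribble over kernel data structures instead */
    int main(void)
    {
            int *p = (int *)0;  /* the kind of bad pointer a buggy driver might hold */
            *p = 42;            /* user mode: clean crash; kernel mode: who knows */
            return 0;
    }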
Re:I agree (Score:3, Informative)
Companies are in business for the money. Love it, hate it, that's the goal. If keeping code closed brings them an advantage, most will do it. If opening the code brings the advantage
Re:I agree (Score:3, Insightful)
Re:I agree (Score:4, Insightful)
Re:I agree (Score:4, Insightful)
I have been arguing this for years. Part of "Freedom" is choice, and having the choice to release your source code or not, just as I have the choice to use open or closed source applications. Abuse of a monopoly is not the same thing as closed source.
It is ironic that some (but not most) of the advocates of Open Source rail against anything that is not Free. This intolerance is why they get compared to "commies" and socialists, taking a position that "either software is Free or it should not exist". Fortunately, most of us who are Free software fans don't share their intolerant views.
If a company wants to keep their source closed and try to actually make money SELLING it, fine. If someone wants to make a Free version that does basically the same thing, even better, because then we have a choice, and the MARKETPLACE decides.
Re:Real Story... (Score:2)
Wow support for 4k stacks!!! (Score:5, Funny)
Re:Wow support for 4k stacks!!! (Score:5, Funny)
4 K is very cold.
A stack is a collection of pancakes.
Therefore we're talking about frozen pancakes.
In other words, I have no idea.
Re:Wow support for 4k stacks!!! (Score:5, Funny)
Re:Wow support for 4k stacks!!! (Score:5, Informative)
Re:Wow support for 4k stacks!!! (Score:5, Informative)
You can turn off the 4KB stacks and go back to the old 8KB behavior by recompiling the kernel with the proper option set, but Linux distros based on 2.6 all use (to the best of my knowledge) 4KB stacks by default.
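For reference, assuming an x86 2.6 tree, the knob in question lives under "Kernel hacking" in the kernel config:

    # .config fragment: 4 kB kernel stacks enabled...
    CONFIG_4KSTACKS=y
    # ...or left out for the old 8 kB behavior
    # CONFIG_4KSTACKS is not set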
Re:Wow support for 4k stacks!!! (Score:4, Funny)
Re:Wow support for 4k stacks!!! (Score:2, Interesting)
Re:Wow support for 4k stacks!!! (Score:5, Informative)
Alpha: 8 kB
Sparc64: 8 kB
Itanium: 4, 8, 16, 32, or 64 kB (usually set to 16)

You can always double-up in software. The VAX has 1/2 kB pages (512 bytes), but the Linux port puts 8 of those together to make a 4 kB page. The 680x0 processor lets the OS choose the page size to be pretty much anything.
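If you're curious what page size your own machine ended up with, a tiny sketch:

    /* prints the page size the kernel exposes to user space */
    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
            printf("page size: %ld bytes\n", sysconf(_SC_PAGESIZE));
            return 0;
    }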
Re:Wow support for 4k stacks!!! (Score:5, Informative)
When you make a system call, it typically executes on its own stack, separate from the one you get for user state. The question is, how big should that stack be? It constrains how deeply nested you can get into function calls while in system state and how much space they can chew up for local variables. Until recently on Linux it's been 8K bytes (think 8192 plates), but they switched over to 4K, only half as much space (or half as many plates).
Some drivers as written count on having that whole 8K of space to play with, and they have to be rewritten. Since nvidia provides neither an Open Source driver nor sufficient information to allow anyone else to write one, we have to wait until they deign to make that change. Fortunately, they've finally gotten around to it.
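Roughly how the 2.6 x86 tree expresses that choice (paraphrased from the thread_info header, so treat the details as approximate):

    /* paraphrased: the per-thread kernel stack size on x86 */
    #ifdef CONFIG_4KSTACKS
    #define THREAD_SIZE     (4096)
    #else
    #define THREAD_SIZE     (8192)
    #endif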
Re:Wow support for 4k stacks!!! (Score:5, Informative)
The problem with interrupts is that you don't have much control over when they arrive, and when they do, they need stack space. So with interrupts interrupting each other, you can quickly use a lot of stack space. If you were very unlucky, you could probably overflow even a 16KB stack that way.
So you would either have to disable interrupts or make sure there is always enough stack space to take an interrupt. Disabling interrupts is something we don't want to do for more than a few nanoseconds, so something has to be done.
With 4KB stacks this problem becomes even worse, but there is a solution. Assume we need to be able to handle, for example, five interrupts at the same time and each of them needs 3KB of stack space. With the traditional approach, we would need to always leave 15KB of free stack space in every thread. But we are never going to need all of that in every thread, because at any time there is only one thread executing on each CPU.
Interrupt stacks mean that rather than using the stack of the current thread, we simply switch the stack pointer to a different stack used only for interrupts. We will still use a small amount of stack space in the current thread, but certainly less than 100 bytes, and only for the first interrupt. This means that the thread stack no longer needs to leave free space for some unpredictable amount of interrupt data.
The kernel design requires the kernel stack of every thread to have exactly the same size (and a power of two); the current macro on x86 is one piece of code relying on this. But within an interrupt, current doesn't make any sense, so it should be possible to make the interrupt stacks larger than the thread stacks. That way we can have a few large interrupt stacks and a lot of small thread stacks, which uses less memory than a lot of large thread stacks. The number of interrupt stacks just has to be one per CPU or one per handler, depending on your design.
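For the curious, here is roughly what that current machinery looks like on x86 (paraphrased from the 2.6 sources): because every thread stack is THREAD_SIZE-aligned, masking the stack pointer finds the owning thread's bookkeeping, which is exactly the trick that stops making sense on a separate interrupt stack.

    /* paraphrased sketch: mask the low bits of %esp to find the
     * thread_info sitting at the base of the current thread's stack */
    static inline struct thread_info *current_thread_info(void)
    {
            struct thread_info *ti;
            __asm__("andl %%esp, %0" : "=r" (ti) : "0" (~(THREAD_SIZE - 1)));
            return ti;
    }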
My system currently has 1 CPU, 12 interrupt handlers, and 101 threads. That means saving 4KB per thread and then creating a single 16KB interrupt stack would save a lot of memory: 101 threads x 4KB is roughly 404KB saved, against only 16KB spent.
mem=nopentium (Score:5, Funny)
Yippee!!! (Score:5, Funny)
Re:Yippee!!! (Score:2, Informative)
I know ET Pro is an addon for ET, but it seems like every server uses it.
The Best Test (Score:5, Informative)
One of the best online FPS games and it's free-as-in-beer.
Keep up the good work NVIDIA.
Re:The Best Test (Score:5, Funny)
Or, if you are bound and determined to hear "I need a medic!" as you drift off to sleep every night, at least ease yourself into it. I hear that "crack addict" is a good introductory stage to ET addiction.
Re:The Best Test (Score:2)
And Enemy Territory it is!!!
Re:The Best Test (Score:2)
OpenGL header files problem (Score:5, Informative)
Re:OpenGL header files problem (Score:3, Insightful)
nvidia specific features ?
Both the OpenGL API and ABI (on Linux) are standardized, so it doesn't matter whose headers you use, as long as they are for the OpenGL version you want to use.
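In practice that means something like this builds identically against Mesa's or nVidia's headers, and links against whichever libGL.so.1 is installed (a minimal sketch; glGetString needs a current GL context at run time):

    /* sketch: the Linux OpenGL ABI lets one binary work with any
     * conforming vendor libGL */
    #include <GL/gl.h>

    const GLubyte *vendor_string(void)
    {
            return glGetString(GL_VENDOR);  /* e.g. "NVIDIA Corporation" */
    }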
The beta drivers worked well (Score:5, Informative)
Thad
This is a major release (Score:5, Informative)
Who still makes truly open drivers? (Score:2, Informative)
Re:Who still makes truly open drivers? (Score:5, Informative)
Are there any video card manufacturers left who release other than binary only drivers?
Matrox releases open-source drivers for some of their product lines (e.g. the Millennium G series -- G400, G450, G550, etc.). The mga driver that comes along with X is the same as Matrox's, for that reason. And 2D performance under the open-sourced Matrox drivers is actually pretty damned good. This all sounds great, doesn't it? Unfortunately, Matrox's Linux support sucks, and the support for Matrox from the DRI project is fairly nonexistent right now. So if you do have any problems with the driver, or want to get 3D/DRI/hardware acceleration issues solved, you're gonna have to learn to hack the drivers/kernel modules yourself. Good luck.
Re:Who still makes truly open drivers? (Score:2, Informative)
You could still use the old nVidia drivers (Score:5, Informative)
4k stacks are a good thing, a first step toward Linux supporting an insane number of simultaneous processes on the system.
What about the source code? (Score:2, Insightful)
I don't know about you guys, but I think having the source code to recompile it manually would help out immensely.
It's funny when you think about why hardware companies like to keep their source code secret (i.e. you only get the drivers). If they claim that someone may use it for some unfit purpose, then the question is: if someone has the source code without the hardware, isn't it inherently useles
Re:What about the source code? (Score:5, Insightful)
That's funny, I don't.
First, fixing this stack size problem is not a simple re-compile of the same code. Depending on how the driver is written, this could well be a non-trivial task.
Second, even if you had the source that does not mean that you could distribute a fixed version. Open source != Free Software.
Third, they may be closed source drivers, but they are miles ahead of the current FOSS drivers. The zealots can run their "pure" systems and suffer graphics glitches and poor 3D performance. I'd rather just use something that works. If that meant sticking with my old kernel a bit longer, then so be it.
they just don't want to fork it over because somehow you may "magically" make the component up yourself out of basement and not have to buy it.
Not you - their competition. ATI has always been plagued by crap drivers. If ATI had a peek into how NVidia does it, you can be sure they'd take something away from it. NVidia would lose a competitive advantage. The GPU war is nasty. The competition is killer - they'll take any advantage they can get.
Wouldn't matter (Score:3, Insightful)
It actually slows game performance in many cases. Games are written for consumer cards, not pro ones, and what is good for pro apps isn't always good for consumer apps. Hacking your card to look like a Quadro (or getting a real one) won't make your shit run faster, if your shit is games.
More importantly, GeForces aren't certified by pro companies. This is important if yo
Further Testing (Score:3, Interesting)
NVIDIA is still impressive with its Linux drivers! (Score:2, Informative)
32bit OpenGL support on AMD64... (Score:2, Informative)
Btw, that was done for the DRI drivers quite a while ago - talk about the usefulness of having access to the source code. And no, they aren't that useless - you can still play UT2004 with them, although it won't look as good (and I didn't notice much difference, except for performance, in ET (btw, for some reason, my FX5200 is _way_ slower while playing on radar/batt
PPCP (PowerPC Please) (Score:5, Insightful)
It's not like nobody can do it... [apple.com]
Thank you.
Re:PPCP (PowerPC Please) (Score:2)
Re:PPCP (PowerPC Please) (Score:5, Funny)
compile those drivers for us PowerPC owners who also pay for the cards?"
Oooo ooo ooo can I be the first to be modded up for saying "Just another reason to switch to Windows!"..?
Re:PPCP (PowerPC Please) (Score:4, Insightful)
Hence Linux support is kind of thin at this point; it's just a smaller market than Windows. However some people, like nVidia, feel that there is enough to warrant writing drivers for, to increase sales. Remember: this is a company, they don't do things for the good of humanity, they do things to make money.
So let's take the Mac now, being the only real PPC platform that would use nVidia cards. What percentage of computers are Macs is a matter of some dispute, but it's between 3-5%. Then consider that most Mac users don't run Linux. It's VERY rare, in fact, since one of the reasons most Mac users buy Macs is for MacOS. It is certainly under 5%, and probably under 1%.
So, even using optimistic numbers, you are talking 0.25% of the market, and realistically it's probably more like 0.05% or less.
Now on top of that, second hand sales of Mac graphics cards are pretty low. Since they are special, and aren't compatible with normal off-the-shelf PC cards, you don't see a lot of them sold. What you buy with a Mac is what you have for the life of that Mac in most cases. Well, that means there isn't a big incentive to get you to switch to nVidia cards. You either got one with the Mac, or you didn't. You aren't likely to change later so no profit motive for nVidia.
So you have a very small percentage of computer users that aren't likely to change cards after purchase, that use a different processor architecture (and hence require more programming and testing). Not really a ripe market for a driver port.
You have to understand that the x86 Linux market is populated by a high number of DIY computer builders. Those people can be, and are, swayed to certain hardware by the availability of non-suck drivers. Thus it is in nVidia's financial interest to make drivers for them, though they are a small market segment. The PPC Linux market is not capable of DIYing and is less likely to change to a new card because of it. Also, it is a much smaller market. Thus it is NOT in nVidia's financial interest to make a driver for it.
When you deal with corporations, at least ones of any decent size, you always have to remember that it is money they care about, not humanity. They do things because those things make them money, or get them good press, which leads to more money. Not because those things are for the good of humanity.
ATI (Score:5, Informative)
It took 2 third-party patches and a recompile to get their driver to install on Fedora Core 2, and it still crashes WineX.
Could NVIDIA finally,slowly be getting it? (Score:5, Interesting)
I must admit I am a bit surprised that SLASHDOT didn't pick up on it. It might just be a little insignificant thing which doesn't warrant much attention anyway - who knows. Of course everyone is mentioning the support for 4k stacks. And of course this is important. Anyone who has used Andrew Morton's patch set knows what a PITA this issue was. But nvidia did even more than fix the single most blocking issue regarding their drivers and the 2.6.x kernels.
They also:
Added support for ACPI
Fixed problem that prevented 32-bit kernel driver from running on certain AMD64 CPUs.
Added support for GLSL (OpenGL Shading Language).
along with the new nvidia-settings utility-GPL'ed and written in GTK2....
and finally they added:
Added a new Xv adaptor on GeForce4 and GeForce FX which uses the 3D engine to do Xv PutImage requests.
Now I am not an expert on such things - 25 years of experience and I am still left asking more questions than I can answer. But I noticed this little innocuous "xv" thing and was like WOW - cool. I leave it up to those who know more to shoot me down - but doesn't this little "xv" thing mean that all those Linux users who use nvidia GeForce4 and FX cards suddenly got a tremendous boost when doing much of anything with video? After all, Xv is what all of the video players under Linux use for good quality full-screen video (mplayer, xine, totem, gxine, helixplayer etc.)
Now if I understand this correctly, every time a PutImage() request comes along under Xv it is handed over to the 3D engine - automatically. It seems as if this would be a very, very significant reduction in CPU usage - particularly for older generation (PII/PIII) machines which happen to have fairly modern graphics cards. Full-screen divx under mplayer with the new drivers uses 12% CPU on average on my machine - I unfortunately did not do a benchmark to test this - but if my memory serves me correctly this is significantly less than what it was with the older drivers.
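An easy way to check this yourself with mplayer's standard output drivers (watch CPU usage in top while each runs):

    mplayer -vo xv movie.avi     # Xv path: scaling/colorspace conversion on the card
    mplayer -vo x11 movie.avi    # plain X11 path: everything done on the CPU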
Now the downside to this - at least for the time being - is that some apps don't quite work with these new changes: Xine and its siblings (totem, gxine, kxine etc.)
But I assume these will be fixed pronto.
Well, where am I going with this train of thought:
Putting this kind of support for Xv in the NVIDIA drivers is really simple for the NVIDIA guys - perhaps even trivial - but it can mean a tremendous improvement for the users of these cards. NVIDIA has always treated Linux like a second class citizen - but hey, who can complain - at least they acknowledge that Linux exists; compared to the BSDs, Linux support is great - of course only if you are using x86 CPUs. Now everyone knows that the graphics workstation market has all but disappeared. But what if NVIDIA was to decide to really take advantage of the X11 windowing system and its features?
Imagine if NVIDIA would actually provide good RENDER support - wow, what a difference that would make for 2D desktop support - particularly under GNOME, which uses RENDER extensively in VTE/PANGO - i.e. why text scrolling in gnome-terminal is so abysmal. I am still stumped by the fact that the open-source X11 nv driver supports RENDER far, far better than NVIDIA's own in-house drivers.....
Imagine if NVIDIA would really support the libfixes, libdamages and libcomposite extensions which are currently being developed at Xorg-X11. Sun's Looking Glass is already using libdamages and libfixes - I got it up and running on my machine yesterday - and yes, it is still pre-alpha - but I have never, ever seen such a fluid desktop environment. This tech is almost *evil* - the promise it presents is simply baffling, relegating all previous X11 windowing experiences to the days of the stone age. I don't really care that much about Looking Glass - if NVIDIA properly supports the X11 extensions, we will have cairo-enabled desktops inside of the next year which will fundamentally alter the X11 experience for X users.
Ok. So here is the point of this little essay: If NVIDIA would simpl
Re:Could NVIDIA finally,slowly be getting it? (Score:3, Informative)
Snipped from the driver's README:
Personally I haven't noticed any difference, but then I've got some AGP issues, so YM
Re:What about ATI? (Score:3, Informative)
Not all is perfect (Score:4, Interesting)
A problem that leaves the console framebuffer blank after X is started remains. You need to work around it by adding
Option "IgnoreDisplayDevices" "TV"
to your xorg.conf. If you are actually using TV out, this could be a bit annoying.
Even worse, it hasn't been more than 24 hours since I installed them, and these drivers have already hung X twice. When an OpenGL process segfaults, it assumes state D (uninterruptible sleep) and becomes completely unkillable, along with X itself. I haven't figured out how to reboot cleanly once this happens. All I can do is ssh in, sync the disks, and hit the power button.
why NVidia should open source their drivers (Score:3, Informative)
Right now, I have to "dual-boot" my X depending on whether I want good RENDER performance or want to run OpenGL stuff. My webpage has a theme I really like that my boyfriend made. The background is an animated GIF of rain falling. I'll get 100% CPU usage on my Athlon XP 1400+ and my browser will become practically unresponsive using the "nvidia" driver, but when I switch over to the open source "nv" driver, it does maybe 15% CPU usage -- just like in Windows.
Mesa is absolutely unacceptable for doing 3D graphics. Even a simple shooter I'm working on, called "Blammo" for the time being, will chug at about 5 fps under "nv."
Now, if only we could bring the features of the "nv" driver and the "nvidia" driver together.
I think the main problem with "linux being ready for the desktop" (as though it isn't -- all that linux really lacks is the ability to twist the arms of OEMs) is that if you want to use certain hardware, you can't get optimal drivers. This is, of course, a vicious circle, because NVidia could fix the problem I have in the "nvidia" driver tomorrow if they wanted, but they won't, because the target market is too small to waste their time.
I might be willing to pay $300 for a brand-spanking-new ubervideocard once the X drivers get fixed, but there are also about 300 other people willing to do the same so long as the Windows drivers stay working.
Perhaps the solution therefore is to change the license on the "nv" driver so that NVidia can use the code that's already out there. It makes the authors of the "nv" driver saints, NVidia stays an evil corporation, I get Windows-like performance out of my hardware in X, and everyone's happy.
Why was this modded "Insightful"? (Score:2)
You added nothing to the thread.
Other than that you're just as much an anti-Linux zealot as any Linux zealot is.
Re:Why was this modded "Insightful"? (Score:2)
For the lazy... (Score:5, Informative)
If you allocate memory in 8k stacks, the kernel's got to find 2 physically contiguous pages of memory. Which I guess gets to be a pain as uptime (and fragmentation) increases. Since memory pages on most hardware are 4k, it's easy as pie with 4k stacks. Plus, you separate some of the kernel stuff like software interrupt handlers onto their own stack (I think that's what it was), hopefully making the system more stable in the process.
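In page-allocator terms, a sketch of what the two sizes mean (kernel-side code using the real __get_free_pages interface; alloc_stack is a hypothetical name):

    #include <linux/gfp.h>

    /* order 1 = two physically contiguous pages (8 kB stack),
     * which can fail once memory is fragmented;
     * order 0 = any single free page (4 kB stack) */
    static unsigned long alloc_stack(int use_8k)
    {
            return __get_free_pages(GFP_KERNEL, use_8k ? 1 : 0);
    }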
Give credit where credit is due... (Score:5, Informative)
Re:a 4k stack is- (Score:2)
It seems to me that this problem could be avoided simply by always allocating pages in pairs (yes, slightly more wasteful of memory, I know).
Alternatively, a variable page size (assigned at boot
Wonder why... (Score:5, Interesting)
Re:Wonder why... (Score:2, Insightful)
I'd understand if you complained about an important story not being published at all - but in every case the story still gets accepted from someone else, so all is fine. Someone else got the credit for submission?
Who fucking CARES?? I'm fed up with whining about having your stories rejected. These comments are now in almost every story, visible even with the "trolls off" setting.
Re:Wonder why... (Score:5, Funny)
I think you mean "retarded".
Re:aahhh finally (Score:4, Informative)
They are still ahead of the game with Linux compared to ATI. ATI only just got Linux drivers out a few months ago. NVidia has had Linux drivers for at least around 2-3 years now (I didn't really care about it before then); this is just about them getting the 2.6 kernel drivers (and new chipsets) out. Also, to my understanding, ATI's Linux drivers aren't all that good, and they have yet to support the 2.6 kernel.
So really, if you want a brand name video card that supports Linux, NVidia is the way to go (at least for now).
Re:aahhh finally (Score:2)
barnes@barnes-t40 barnes $ fglrxinfo
display:
OpenGL vendor string: ATI Technologies Inc.
OpenGL renderer string: MOBILITY RADEON 9000 DDR Generic
OpenGL version string: 1.3 (X4.3.0-3.9.0)
barnes@barnes-t40 barnes $ uname -sr
Linux 2.6.7-gentoo-r7
So ATI's drivers do work with 2.6.x (at least if you use Gentoo; I haven't tried the official package). However, I agree that NVidia's drivers tend to work better.
Re:aahhh finally (Score:4, Interesting)
Man, I just installed these drivers (I was wanting a good excuse to do it, I admit it) on my ancient TNT2 video card, 800MHz Duron, blahdy-blah blah, and now Metisse is running fine. Before, with the nv driver, Metisse barely ran. Amazing how much of a difference the driver makes. ;)
Re:Kernel Stacks... (Score:3, Insightful)
be sure to send a letter to the company explaining why you are giving them your business.
Otherwise the purchases made because of Linux will go unnoticed.
Re:Kernel Stacks... (Score:3, Informative)
Re:Excellent News (Score:2)
Re:Excellent News (Score:2)
Re:Linux newbies experiences on this issue (Score:3, Informative)
Re:question on video in general (Score:3, Insightful)
I hope the moderators mod this up as interesting, because it certainly is. Very, very naive, but interesting none the less. Such a question implies that you are either a teenager or an ivory tower researcher ;-) I hope you manage to keep your idealism.
The system you describe is a distributed operating system. Your hypothetical system has been contemplated by many researchers, perhaps most famously with AT&T's Plan 9 [bell-labs.com]. The problem with this, as with all other distributed operating systems, is that is sti