A Brief History Of NVIDIA And SEGA
Alan writes: "FiringSquad just posted an article on the history of NVIDIA. What makes this interesting is that they include a little bit about the NV2 chip which was developed originally for the Dreamcast. It was using quadratic texture maps (a derivative of NURBS) rather than polygons! The article is over here."
Re:turnabout (Score:1)
The PowerVR PCX2 was the highest-profile chip from PowerVR (an alliance between VideoLogic and NEC), which shipped mid-late 1997 as an add-on solution from VideoLogic and Matrox. The cards had 4MB of RAM on them, but that was all texture memory as it was able to share the framebuffer with the 2D card. That also enabled it to run 3D in a window, something the Voodoo1 couldn't do. Its MiniGL drivers for Quake-based games were really good and put them on par with a Voodoo1, but their Direct3D drivers stunk. Power SGL games ran great, but they were very few and far between, and the most high-profile one was probably Unreal. The PowerVR Second Generation chip (PVRSG) was supposed to ship mid-late 1998, but it kept getting delayed until it finally appeared on the Dead-On-Arrival Neon 250 around the end of '99.
MOD THIS UP (Score:2)
I don't know about NVIDIA being a monopolist like Microsoft, but he's definitely on to something. Look, the HW/SW graphics development arena is controlled by only a few select vendors, and now that hold is merging between the console industry and the desktop industry. As a consumer I want as many choices as I can get, and if NVIDIA becomes the top playah with their rep, it will surely drive competition down and prices up.
I don't think the whole console market would fall, but I would sure like to get the next-gen console without having to pay upwards of $350 US for it.
The dark lining to this silver cloud. (Score:3)
While to some extent that's true, in practice you'd get your arse sued off until you could prove in court that you really *weren't* violating any of nVidia's implementation patents (or anyone else's). There were a few sabre-rattling sessions last year in this vein.
There are only a few straightforward approaches to building any given part of a graphics pipeline. Just about all of these are patented (usually preemptively) by the big graphics card companies. It's not as bad as the software patent arena, but it's still not nice.
If you have lots of money, you can hold out long enough that the big companies will offer to cross-licence technology with you. Otherwise, you'd better pray that they don't consider you a threat.
Perhaps it's not quite this bad, but you'd have to do quite a bit of patent research to avoid stepping on anyone's toes.
Lastly, a nit-pick re. AGP, PCI, and USB. For some of these, you have to pay licensing fees to get the specification manual. For all, if I understand correctly, you have to pay licensing fees to build any hardware that talks to them. The standards bodies are money-driven too.
In practice, the cost will be low compared to the cost of the rest of your card, but it's still there.
Re:radeon? (Score:1)
Re:a scary union (Score:1)
Re:counsoles vs PC's (Score:1)
You will never have a console game that looks better than a computer counterpart.. consoles just can't keep up with the dev cycle of computers.
Sure, they might look good for 4-6 months, maybe a year after release.. but computers will always surpass them (and consoles are usually around for 4+ years..)
In fact, when quake2 was being ported over to the psx and n64, they had to
Plus there's the addons that you can't use on a console version of a comp game.. no hope of adding user made maps, mods, player models, etc.
And on top of that, you can't play the same game on the latest console.. they're all incompatible (The same way comps were back in the early 80's)..
If I wanna play quake1, 2 and 3.. then I need quake 1 and 2 for psx or n64 and then 3 for my dc.. If I was doing the same thing with my comp then I'd just need to install them and start playing.
hrm.. this is too offtopic as it is..
-since when did 'MTV' stand for Real World Television instead of MUSIC television?
Quadratic Texture maps are NURBS? (Score:4)
I'm sorry, but this makes no sense. The NV2 chip used quadratic surfaces, not quadratic texture maps. This is like comparing apples and oranges.
Re:a scary union (Score:1)
Re:Quadratic surfaces (Score:2)
Don't bash curved surfaces just because patches don't do what you want them to.
Subdivision really is the future. Maybe it is too early to implement in a game today, but it is definitely possible to implement an engine based on subdivision surfaces and get good results. I have done it.
http://verse.sourceforge.net/
Re:a scary union (Score:1)
Unfortunately, I have to say that nVidia has cleaned up in the consumer market. Matrox has relegated itself to building solid business cards (which is why the G450 ended up being slower than the G400), and ATI has completely lost in my mind because their drivers suck.
With the loss of 3dfx as a reasonable competitor, I would have to say that nVidia has a virtual monopoly in the home-gamer 3D graphics arena. I just hope they don't stop building significantly better cards for that arena just because they are able to make money at the status quo.
Re:Not ready for prime time (Score:1)
I've been in the PC industry for many years, so I am quite aware of the market. I bought my first Nvidia card quite cautiously (the Riva TNT), and installed it in a machine which sat right beside a machine with a Voodoo 3 2000 inside. The difference in graphics performance was minor, but the Riva TNT had drivers which were updated more frequently, seemed more stable, and the card was cheaper. This process has been repeated with the Riva TNT 2, Voodoo 3 3000, the Voodoo 3500 TV, Voodoo 5500, and various incarnations of the Nvidia GeForce/GeForce 2, both in my personal computers and in the machines assembled for customers. Starting towards the middle of last year, most of our more knowledgeable customers weren't buying the Voodoo series, but were asking for cards in the Nvidia series.
Plain and simple: Nvidia conquered 3Dfx by better pricing, equal or better technology, and a more rapid development cycle. No one expected it, any more than anyone expected AMD to ever better Intel.
This happens in software as well:
Wordstar, WordPerfect, Word.
VisiCalc, Lotus 1-2-3, Excel.
Dbase, FoxPro, Clipper, Access.
Christ, it was amazing when CP/M was dethroned by MS-DOS, or when CD's bested LP's or tapes.
The technology doesn't have to be better (Motorola versus Intel, VHS versus Beta), but Nvidia was at least the equal of, and arguably better than, 3Dfx. It wasn't a matter of buying "the little guy, re-label the IP and sell[ing] it as their own."
Re:a scary union (Score:1)
Re:Nvidia and work ethic. (Score:2)
-----------------------
Re:NV20? (Score:1)
How do you know? Have you been tweaking around with a prototype? I'd sure like to hear NVidia's announcement on the NV20 being named and released, or at least know when to expect it to be released.
Re:Quadratic surfaces (Score:1)
Er, games of subdivision? I think the word you're looking for is "therewith".
OK, I'll fuck off now.
Re:NV20? (Score:1)
Re:Quadratic surfaces (Score:1)
I know, you didn't turn up at GTS....
The bad thing is that hardware vendors are trying to come up with hardware that can do NURBS and other patch surfaces that basically suck, instead of creating something general.
All points on a NURBS/Loop/Butterfly/Catmull-Clark surface are basically just weighted sums of a number of CVs in the control mesh. This means that you can, for each vertex, create a list of the CVs it depends on and quickly re-shape the curved mesh if the control CVs are moved. This could be done in hardware, and would really be an easy way to accelerate all types of tessellated surfaces (even polygon-reduced meshes).
I have implemented this in my experimental api: www.obsession.se/ngl
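To make the weighted-CV idea concrete, here is a minimal sketch in C (nothing from the api linked above; the struct names and layout are just for illustration): each tessellated vertex keeps a precomputed list of CV indices and basis weights, and re-evaluating the surface after the control CVs move is just a small weighted sum per vertex.

    /* Sketch only: per-vertex precomputed weight lists for a
     * NURBS/subdivision surface, as described above. */
    #include <stddef.h>

    typedef struct { float x, y, z; } Vec3;

    typedef struct {
        size_t count;         /* how many CVs this vertex depends on */
        const int *cv_index;  /* indices into the control mesh       */
        const float *weight;  /* precomputed basis weights           */
    } VertexWeights;

    /* Re-evaluate one tessellated vertex after the control CVs have moved. */
    Vec3 eval_vertex(const VertexWeights *w, const Vec3 *cvs)
    {
        Vec3 p = { 0.0f, 0.0f, 0.0f };
        for (size_t i = 0; i < w->count; i++) {
            const Vec3 *cv = &cvs[w->cv_index[i]];
            p.x += w->weight[i] * cv->x;
            p.y += w->weight[i] * cv->y;
            p.z += w->weight[i] * cv->z;
        }
        return p;
    }

Since the index and weight lists never change while the control mesh topology stays fixed, this is exactly the sort of regular, data-parallel work that could be done in hardware.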
Re:Quadratic surfaces (Score:2)
I don't think it's bashing it. Curved surfaces have their problems too; for one, they don't add detail. Curviness isn't detail, it's smoothness, and even though it looks a lot better with low-poly models most of the time, subdivision doesn't add much to a model over 8000 polys. Very often you find yourself adding polygons that are not smooth, but carry detail (hard edges) instead.
John Carmack pointed out the texturing problems, but there are loads more, like level of detail, texture coordinates and material seams. Triangles are about as stupid a primitive as you can have for a graphics primitive, but they are extremely convenient. For example, their bounding volume is bounded by their control points, they are discrete (i.e. can be detached and manipulated individually) and they can be textured in any way possible.
I have the feeling that most graphics primitives (such as curved surfaces, maybe not including subdivision surfaces) are going to lose out to triangles.
To put it simply: complexity of geometry used to be a problem in 3D graphics, but with the extreme bandwidth and pipeline increases we see today, the stream of triangles is becoming less of a problem. A good LOD algorithm can get the triangles/pixel ratio down to around 1, which will be a very feasible ratio in a very short while.
Most of the added processing time nowadays goes towards shadowing, detail maps, projective lights, motion blurring/oversampling/other framebuffer tricks, but it seems the triangle limit is diminishing.
All of these new techniques get easier with triangles. With adaptive subdivision these techniques get harder to implement. Try to do animated soft-skinned subdivision surfaces with adaptive subdivision and shadow casting.
Triangles are stupid, but in the end, all graphics cards draw triangles. I think that curved surfaces won't be all what they were hyped up to be, they simply add too little at too much cost.
Michiel
Re:Nvidia and new aliances (Score:2)
Just as in the workstation market, Sun was able to muscle-out SGI by steadily raising the bar and focusing on good-enough-for-my-core-markets improvements, so NVIDIA beat 3dfx. Although one could always argue that 3dfx beat itself. Still, in my mind, the turning point was when NVIDIA hired a lot of key engineers fleeing SGI which was in the process of spiralling downward. If 3dfx had gotten that team, we might be looking at a different story.
In any event, it's remarkable that engineers from either Sun or SGI were able to shift their mindset from product cycles of 3 years (workstation graphics) to product cycles of 6 months (PC 2D graphics a la Cirrus Logic). If you've grown up building projects the slow way, a faster tempo is not easy to adjust to! I've always thought that besides his brilliance, the psychological development of John Carmack's early exposure to 1 month product cycles helped him learn the focus needed to keep his current 18-month(?) product cycles going, something his competitors have really struggled to match.
It'll be interesting to see if the lack of discipline and business focus so evident within SGI engineering begins reasserting itself at NVIDIA as pressures on them ease and the early SGI engineers look at the comfortable value of their options, which should be fairly vested by now. As a consumer, all I can say is thank goodness ATI is around to keep the pressure on them.
People matter. There's an interesting book about how the 3D hardware market was won and lost just waiting to be written.
--Greg
Re:Nvidia and new aliances (Score:3)
No pal, you're wrong (Score:1)
Just my 2 cents.
Re:a scary union (Score:1)
Re:turnabout (Score:2)
Not at all. Quadratic surfaces were a fundamentally better way of explaining what shape something was in three dimensions. The tile based rendering still has polygons, but breaks the screen up into tiles to do the final render. IIRC the optimisation runs something along the lines of "is this tile entirely covered by something, all of which has a 'nearer' Z? if so, bugger it, I won't even try".
They scale particularly well, BTW. The PowerVR chip was designed to minimise bandwidth and be scaleable to 16 ways for arcade machines. The tiles glue together in a kind of smarter SLI fashion.
There were also early PVR1 accelerators around that rendered to a lump of memory for the 2d card to then actually display. Obviously a bit flawed for today's framerates but a perfectly valid option at the time. Again, driver shiteness killed these off and I have a suspicion that NEC/Videologic signed an exclusive deal with Sega for the PVR2.
I hope they can get PVR2 accelerators together for the PC. Tom's benchmarks give the impression of an altogether more interesting approach than the Nvidia 'hit it with more grunt' thing that's going on at the moment - and potentially a greater advantage as the number of vertices increases over sheer fillrate requirements.
Still. I'm relatively happy with a voodoo3, so don't look at me.
Dave
Re:a scary union (Score:1)
Although it would be kind of nice to get a free console in the mail once a month :P Unfortunately it will only cost $200 a month to get it activated, and you will have to download your games from the net every time you want to play.
They will also put rotating banner ads at the top of the games and a little display that sucks your mind out in the controller, and if you call their 1-800 number they will install a device in your wallet that automatically sends all your money to them.
WTF?? (Score:2)
He's probably talking about these cards having hardware support for quadric surfaces.. but since it's been fairly much decided that it's generally faster to decompose these kinds of surfaces into triangular meshes for rendering, especially in pipelined graphics architectures such as OpenGL, features like this don't get used much.
Raytracers like POV-Ray and others evaluate quadrics and other classes of curved surfaces on a pixel-by-pixel basis, and other renderers subdivide them down to sub-pixel-sized triangles for rendering. However, this approach is seldom suitable for realtime applications.
Most likely, this 'quadratic texture support' is just a hardware implementation of OpenGL evaluator functions, which simply assist in the decomposition of quadric surfaces into triangle meshes.
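For what it's worth, that is roughly what the OpenGL 1.x evaluator calls do: you hand the driver a grid of control points and it tessellates the patch into an ordinary mesh before rasterisation. A minimal sketch (the control points here are arbitrary; a 3x3 grid gives a biquadratic patch):

    #include <GL/gl.h>

    /* A 3x3 grid of control points => a degree-2 (quadratic) patch. */
    static const GLfloat ctrl[3][3][3] = {
        {{-1.0f, -1.0f, 0.0f}, {0.0f, -1.0f, 1.0f}, {1.0f, -1.0f, 0.0f}},
        {{-1.0f,  0.0f, 1.0f}, {0.0f,  0.0f, 2.0f}, {1.0f,  0.0f, 1.0f}},
        {{-1.0f,  1.0f, 0.0f}, {0.0f,  1.0f, 1.0f}, {1.0f,  1.0f, 0.0f}},
    };

    void draw_quadratic_patch(void)
    {
        glMap2f(GL_MAP2_VERTEX_3,
                0.0f, 1.0f, 3, 3,   /* u range, stride in floats, order */
                0.0f, 1.0f, 9, 3,   /* v range, stride in floats, order */
                &ctrl[0][0][0]);
        glEnable(GL_MAP2_VERTEX_3);

        glMapGrid2f(16, 0.0f, 1.0f, 16, 0.0f, 1.0f); /* 16x16 tessellation */
        glEvalMesh2(GL_FILL, 0, 16, 0, 16);          /* emitted as a plain mesh */
    }

Whether the driver or the chip does the subdivision, what actually reaches the rasteriser is still triangles.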
Get a clue michael.
Re:Nvidia and new aliances (Score:1)
That's stupid and doesn't make any sense at all. SGI open sourced the OpenGL reference implementation and GLX. Smart money is on the AGP code that nvidia licensed.
Of course, none of this explains why they won't release the fucking info needed to write an independent Mesa + DRI driver for their cards. Their drivers are great, but an open source one would be even better. Since we can't really expect them to release the source for their drivers, releasing the programming info would be a reasonable substitute.
Re:Nvidia and work ethic. (Score:2)
-----------------------
Re:Nvidia and work ethic. (Score:1)
------
Re:NV20? (Score:2)
Quadratic surfaces (Score:5)
Er, no.
For several years now, Nvidia has been kicking ass like no other graphics company, but let's not romanticize the early days. The NV1 sucked bad, and it would have been damaging to the development of 3D accelerators if it had gotten more widespread success. Microsoft did a good thing by standing firm against Nvidia's pressure to add quadratic surfaces to the initial version of D3D.
There is an intuitive notion that curved surfaces are "better" than triangles, because it takes lots of triangles to approximate a curved surface.
In their most general form, they can be degenerated to perform the same functions as triangles, just at a huge waste in specification traffic.
Unfortunately, there has been a long string of products that miss the "most general form" part, and implement some form of patch surface that requires textures to be aligned with the patch isoparms. This seems to stem from a background in 2D graphics, where the natural progression from sliding sprites around goes to scaling them, then rotating them, then projecting them, then curving them.
3DO did it. Saturn did it. NV1 did it. Some people are probably working on displacement mapping schemes right now that are making the same mistake.
Without the ability to separate the texturing from the geometry, you can't clip any geometry in a general way (not even mentioning the fact that clipping a curve along anything but an isoparm will raise its order), and you either live with texel density varying wildly and degenerating to points, or you have texture seams between every change in density. No ability to rotate a texture on a surface, project a texture across multiple surfaces, etc. You can't replace the generality of a triangle with primitives like that.
Even aside from the theoretical issues, NV1 didn't have any form of hidden surface removal, and the curve subdivision didn't stitch, frustum clip or do perspective. It was a gimmick, not a tool.
All water under the bridge now, of course. NV20 rocks.
John Carmack
Re:a scary union (definition of the word monopoly) (Score:2)
I already mentioned that they were not the only company, just the only one with much power in the market. I don't put nVidia down for it, just wish other companies would jump in and make a new good product.
HELLO NERDS (Score:1)
Re:NV20? (Score:3)
Video card makers actually ASK HIM what he wants to see in the next generation video cards, then scurry off and figure out how to do that. I don't doubt that they send him a few engineering samples once they have working prototypes.
No company is so suicidal that they are going to create a video card that won't run Doom2001 (or whatever it's going to be called) perfectly.
Re:Quadratic surfaces (Score:3)
I'm not a big booster of hardware curves in any form, but I only rail against hardware schemes that use aligned textures.
John Carmack
Not ready for prime time (Score:1)
In a word. . . Bullshit
Here's why. First, the product development effort was less than aggressive, and the execution of the implementation was so poor that most mods were never even tested. NVIDIA is one of those companies that never really understood that the eighties were over. Leveraged buyouts of other people's IP based solely on future profits are a recipe for disaster. The GeForce2 efforts are not ready for delivery to Toshiba and Toshiba is pissed. Unfortunately Toshiba is on the hook for delivery and they're going to be stuck until NV20. The X offering is only a pipedream at this point.
Look, efforts like this are based upon waiting for little companies to develop a similar offering, and then they buy up the little guy, re-label the IP and sell it as their own. We in the end are the ones who get screwed because product hits the street late and is not ready for prime time.
Your argument is nonsense. (Score:1)
Unfair anti-MS bias (Score:2)
This is nonsense and self-contradicting. Microsoft's direct3d didn't kill nv1, a closed source proprietary texture mapping procedure that was incompatible with any other card killed nv1. Had nvidia worked closer with developers before direct3d became the standard, nv1 could have been accepted and implemented on a larger scale. True, direct3d replaced nv1, but blaming microsoft for the death is pointless: proprietary policies killed nv1, not its competition.
Diamond/Sega PC Card (Score:1)
a scary union (Score:1)
Nvidia is a known monopolist. No one can touch them in the graphics field now. They are surprisingly similar to Microsoft, except there is no equivalent Apple in the graphics field to provide a counterpoint.
Sega almost reached the levels of monopolist in the console industry, but fell short. If they work with Nvidia, it's conceivable the console market would fall to this conglomeration.
Is this really what we want? Even less choice in the console market than we already have? An even stronger assurance of Nvidias monopolistic dominance in the graphics card market?
Personally, it scares me.
Re:a scary union (Score:1)
Hey, you needn't worry about a Sega/Nvidia monopoly in the console market! Remember, competition will be provided by Microsoft's X-Box!
uh... that probably doesn't make you feel much better, does it?
Re:a scary union (Score:1)
The NV2 will suffer, as it is before its time. (Score:2)
However, the PC does not have those high speed wide busses for interchange between main and graphics ram. I fear that this may cause some performance loss, as although the chip will still be good, it will not be optimal.
The limiting factor on games and graphics performance these days is not calculation speed; it is bandwidth, pure and simple. In ten years, everything will be bandwidth. If you consider that in one clock cycle on a modern chip in the 1 GHz range light travels only a foot or so, we can see that information interchange between CPUs and memory will become fundamentally limited by the speed of light within 18 months, as the distance from CPU to memory exceeds the distance that light travels in a clock cycle. The only way to improve is via bandwidth increases, and better use of cache.
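For reference, the back-of-the-envelope arithmetic behind that "a foot per clock" figure is just this (a tiny sanity check, nothing more):

    #include <stdio.h>

    int main(void)
    {
        const double c = 2.998e8;       /* speed of light, m/s */
        const double clock_hz = 1.0e9;  /* a 1 GHz part        */
        double m_per_cycle  = c / clock_hz;          /* ~0.30 m  */
        double ft_per_cycle = m_per_cycle / 0.3048;  /* ~0.98 ft */
        printf("light travels %.2f m (%.2f ft) per cycle\n",
               m_per_cycle, ft_per_cycle);
        return 0;
    }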
But that is the future, and it is the future that this chip was designed for, in the form of consoles, which are always technologically ahead.
I fear this chip will suffer through being ahead of its time.
You know exactly what to do-
Your kiss, your fingers on my thigh-
turnabout (Score:2)
--
Re:NV1, a crying shame. (Score:2)
I guess the idea was that you could buy one card and have all your gaming video/sound/control needs taken care of.
More or less, in theory. In practice of course, none of the games I wanted to play on it ran on DirectX. Oh well.
Dave
Re:a scary union (Score:1)
ATI has a bigger market share. And as much as I love Sega, teaming up with them would probably drag anyone down (of course, the article mentions nothing about Sega and NVIDIA teaming up, just saying)
--
Re:a scary union (Score:1)
Yes, NVidia is a monopolist. I challenge you to find me a large company that is not. 3dfx was. They just failed. Intel is. AMD is trying to be. That's the nature of this our wonderful economy.
Anyway, my point is that as SOL as Sega is right now as far as their console systems, I don't see ANYTHING giving them an easy comeback barring a merger with Time Warner/AOL.
Justin Dubs
Nvidia earned their position (Score:3)
Nvidia achieved market dominance by providing good products at good prices, and coming out with new ones so fast that they overwhelm their competitors. This is not an unfair business tactic; they are just completely ruthless competitors.
When Nvidia came out with the Riva 128, it had one advantage over the Voodoo. It could do 2D. And everyone had a 2D card, so it really didn't matter. It was slower with the early drivers, it had crap image quality, and it ran Quake 2 at 10fps or so on my P133.
The Voodoo2 came out and Nvidia had nothing but a slightly improved Riva 128, but at this point, people still didn't really care, but the 3d only thing was starting to have an effect. People would run systems with Riva 128s and Voodoo2s for good 2d and great 3d, and the option of real openGL.
With the TNT, Nvidia had a performance-competitive product that had much better image quality too, and 3dfx's Banshee was a rehashed, weakened Voodoo2. When the Voodoo3 came out, Nvidia promptly took the performance lead with the TNT2/TNT2 Ultra, albeit by a small margin. From then on 3dfx was way too slow to stand a chance. ATI started to come back into the picture here, and they have been tagging along since.
Nvidia releases new products too quickly for others to remain competitive, and they work quickly towards incorporating features OEMs want and lowering their prices; just look at the extremely dominant set of video chips Nvidia has today. Everything they have out is best of class, or damn near it.
MS won the market using completely ruthless tactics, and now they are their own greatest threat. Nvidia is still pumping out products like there is no tomorrow, and they are aware that if they get lax like 3dfx did, they will fall very quickly. ATI, while lagging behind, could jump right back into the front with one botched product release on Nvidia's behalf.
Also, if ATI gets their act together quickly and writes some decent drivers, and gets a comparable card out at a significantly lower price, then ATI will be able to get some of that lost marketshare back. It will take a better deal to beat Nvidia, however, because the established brand can always charge more for the same thing. And ATI has a huge disadvantage there too, because they are competing with a ton of smaller boardmakers that can charge whatever they want.
Re:a scary union (Score:1)
Finally, the Dreamcast was in competition with the PS2, a console that didn't even enter the market until more than a year after the Dreamcast. I was just amazed when I would talk to people about consoles and I'd mention the Dreamcast, and they'd say, "I'm waiting for the PS2".
As for the Nvidia thing... I wouldn't say that, "Nvidia is a known monopolist." Their lock on the video card industry is a fairly recent phenomenon... when the Geforce came out (1.5 years ago?). Anyway, there is fairly stiff competition for them on the horizon. Hardly a Microsoft situation.
Re:Quadratic surfaces (Score:2)
Re:a scary union (Score:2)
who?
Matrox's latest board is actually *slower* than their previous G400 Max card, and there still have been no announcements of the G800.
ATi has been losing their OEM market, first on the PC OEMs and now from Apple. Even their notebook line is now under pressure from nvidia's chips. Without an OEM market, they are done - even with a better chip, it would take a lot before people would believe that ATi isn't releasing a piece of crap (their drivers in the past have been.. sub-optimal)
NVidia has strong alliances with Microsoft and SGI, and now with their acquisition of 3dfx technology has blanketed the field of 3D graphics with patents.
Most of the graphics-oriented people I know are about 40/40 on which of Matrox or ATI will go under first: the remaining 20% don't acknowledge that the companies are still even competing. Last I checked, there isn't even anyone else trying to make a dent in the field.
Re:a scary union (definition of the word monopoly) (Score:1)
...yet
Re:Nvidia and new aliances (Score:1)
Re:Not ready for prime time (Score:1)
Now as to your comment that the product development effort was less than aggressive and the execution of the implementation was so poor that most mods were never even tested, I'm not quite sure what you're saying here. Are you implying that nVidia has poor product execution? That was true in the beginning, but history has since proved you wrong. Let's run down some of the events:
--TNT comes out, not a bad card and provides a reasonable alternative to 3dfx that isn't so expensive.
--TNT2 happens, now not only is performance near Voodoo levels, but you can get better quality too.
--TNT2U, the first card I ever used that could push enough pixels to run games at a reasonable speed in 32-bit.
--GeForce rolls on in. If this card wasn't huge, I don't know what was; I still have mine. It beat the snot out of all other consumer-grade offerings by a long shot and even keeps up with the more expensive (and later released) Voodoo5.
--GeForce 2, 6 months and about 4x the power. Initially the Radeon kept up, but then Detonator 3 came out and it now consistently outperforms the Radeon.
As for "We in the end are the ones who get screwed because product hits the street late and is not ready for prime time": I don't see how you can possibly make this claim. Every product I've gotten from nVidia has not only been ready for prime time right out of the box, but has only gotten better as I've owned it. When I first bought my GeForce (about 2 months after they came out) I was floored by how much faster it was than my Voodoo3. Then, a few months later it got significantly faster with the release of the Detonator 2 (5.x series) drivers, in addition to gaining some new features. Later still, Detonator 3 (6.x series) came out with another speed boost and a couple more features. Currently I have a beta of Detonator 4 (7.x series) and it again boosts speed and adds full DX8 support. Please remember that these speed increases are on top of already great performance. The GeForce was much faster than the Voodoo3 when it came out; Detonator 2 just widened the gap. The GeForce 2 competed well with the Radeon initially; Detonator 3 just ensured it won all the time.
I also don't see how you can think the cycle is slow. Looking at graphics chip progress lately, I'd say power has been doubling about once every half year or so. Compare that to processors, which double once every 18 months or so. Plenty of advancement if you ask me. nVidia has also been making big gains in bringing features previously reserved for pro cards to regular consumers. While their geometry engines still can't compete with Glint's high-end offerings, it's a start, and it's available for a price that's reasonable.
At any rate, if I've missed what you're saying, rewrite your post to be more clear; if not, you need to do some research before shooting your mouth off. nVidia sure isn't the be-all, end-all of graphics, but they've done a whole lot.
Oh, and by the way, the NV20 isn't a "pipe dream"; it's very real and will be final ere long. You have to remember that chips don't just pop into existence; there is an amount of development time. The NV20 is off the VHDL and into actual silicon. It's not at production quantities yet, but it will be soon.
Re:DONT CLICK THE LINK! (Score:1)
you mean choadsmoker?
i think someone learned a new word reading the bonsaikitten article.
Re:Quadratic surfaces (Score:3)
John Carmack
Re:Quadratic surfaces (Score:3)
It was my experience that subdivision surfaces are much more convenient for modeling free form organic surfaces, but polynomial patches can be more convenient for architectural work.
John Carmack
Re:Why are hardware curves bad? (Score:1)
Re:Nvidia and new aliances (Score:2)
Perhaps because the nVidia hardware is significantly faster than the Matrox G### hardware.
Really, if you want to insult the open sourced DRI effort, at least take the time to think the argument through first.
Re:a scary union (Score:1)
Anyway, I agree that nVidia makes great cards, but at least ATI's Radeon is giving the GeForce a run for its money. I usually recommend GeForces, but the Radeons seem to be able to beat them (except the Ultra, of course) at high resolutions in 32-bit. Competition doesn't go away when one company gets market dominance, just when all the other companies go out of business.
--
Nvidia IS a monopoly (Score:1)
Re:The NV2 will suffer, as it is before its time. (Score:1)
Justin Dubs
Nvidia and work ethic. (Score:2)
Penguin rhetoric. The Linux Pimp [thelinuxpimp.com]
Michael has it backwards (Score:1)
here [gamers.com]."over is article the. Polygons than rather (NURBS of derivative a) maps texture quadratic using was it. Dreamcast the for originally developed was which chip NV2 the about bit little a include they that is interesting this makes what. NVIDIA of history the article an posted just"SquadFiring [firingsquad.com] writes Alan
Re:a scary union (Score:1)
Nvidia and new aliances (Score:4)
Re:a scary union (Score:1)
Sega never came close to a monopoly in the console industry; you're thinking of Nintendo. Sega are the ones who always seem to screw it up in the end.
Monopoly is not the same as success, nvidia have really smart guys working for them, they are successful because they are better than the competition.
How can it be a monopoly when there's a choice? If other graphics vendors don't produce cards that are as good, and aren't taken up by the gamers (the biggest driving force behind PC graphics card success), then it's their fault for not having what it takes..
Sega are the 3dfx of the console world.
It's folks screaming monopoly every time a successful computer company comes along that scares me...
Re:Consoles aren't getting more expensive (Score:1)
Re:a scary union (Score:1)
No flame intended, but snarking at them losing their OEM market has nothing to do with it, and Apple's notebook line could give a rat's ass about NVidia. MS and NVIDIA have no relationship at all. WTF? Are you a troll? Unfortunately now I have bitten, so maybe I'm just being a chump, but since I've already responded, I'll finish.
Look, your figures don't add up. Matrox, ATI etc. have much less than 80% of the market. They are more around 7%, and to top it off everyone including pegasus and gen-h are in the mix and gaining market share quickly, especially being supported by Open Source. Look at KPW, they pulled 800% growth last year in their direct competition with NVIDIA. I believe you need to check your facts.
Hardware Monopolies (Score:3)
-={(Astynax)}=-
Re:Nvidia and work ethic. (Score:1)
Similarly, NVidia only provides TNT2 and below drivers for OS/2.
------
Re:a scary union (Score:1)
However, there is definitely something to be said for the sum being greater than its individual parts, and it's not like both companies don't have marketing and research resources.
Re:a scary union (Score:5)
NVidia does not have a monopoly in any market right now. What they do offer is the best (IMHO) consumer-level graphics card on the market. As well as the second-best. And the third-best. (I'm talking GeForce2 Ultra, GeForce2 Pro, and GeForce 2 here.) That is very different from having a monopoly.
The ATI Radeon is also a reasonably good card, especially if you get the all-in-wonder version. EvilKyro, or whatever it is called, looks interesting, but I don't know all the details about it yet. The point is, NVidia does not have a monopoly.
If you think about it, NVidia is now at the point that 3dfx was at a few years back: a year ahead of the competition. However, unlike 3dfx, NVidia is not encouraging people to use a proprietary API that only works with their own hardware. The best way to access an NVidia card is via OpenGL (even on Windows), which is the most open, cross-platform 3D API standard out there. NVidia has implemented proprietary extensions to OpenGL, but the extensions are purposely laid out such that it is easy to write software which only uses the extensions if they are available, and uses standard GL calls otherwise. (I know this from experience...)
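As an aside, checking for a vendor extension and falling back really is just a string lookup at startup; a rough sketch (the extension name below is only an example, not something anyone here specified):

    #include <GL/gl.h>
    #include <string.h>

    /* Crude check for an extension in GL's space-separated extension string. */
    static int has_extension(const char *name)
    {
        const char *ext = (const char *)glGetString(GL_EXTENSIONS);
        return ext != NULL && strstr(ext, name) != NULL;
    }

    void setup_texture_path(void)
    {
        if (has_extension("GL_NV_register_combiners")) {
            /* take the vendor-specific fast path */
        } else {
            /* fall back to standard GL calls */
        }
    }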
Furthermore, unlike some well-known monopolistic companies, NVidia is still innovating at an alarming rate. As long as they keep doing so, and their prices stay where they are (GeForce 2 for $130, anyone?), I'm happy.
------
NV1 and Sega Saturn Controllers (Score:1)
I'm a huge Sega Saturn fan. I have lots of Saturn controllers. Does it seem likely to anyone that I could slap this board in my box as a second VGA controller, and then mostly use the controller port, and nothing else? I could always slap my 13" Mono VGA monitor on there if I absolutely HAD to have something on it.
Also, while I'm on the subject (and slightly offtopic), anyone know if there's some other way to get Saturn controllers to work on a PC without having to do any soldering? (I know I could do it myself if I wanted to break out the iron. That's not what this is about.)
The Saturn 3D Pad controller is perhaps the best non-vibrating game pad of all time. Analog joystick, digital pad, analog triggers, six buttons, and start. It also has a mode in which it emulates the normal saturn game pad. This controller kicks ass. I could put so much smack down in MW4 with that thing...
--
ALL YOUR KARMA ARE BELONG TO US
Nvidia a monopolist????? (Score:1)
Re:counsoles vs PC's (Score:1)
It'll take a few years for consoles to surpass this
-since when did 'MTV' stand for Real World Television instead of MUSIC television?
The sordid history of the industry leader (Score:1)
---
Re:a scary union (Score:3)
The difference here between Nvidia and Microsoft is that Nvidia isn't using monopolistic tactics, and there are viable competitors on the same platform. Nvidia just has better products, and no one else has a better card. Would you say that the Radeon isn't a good card with a nice bit of market share? If you go to any Gateway store (I was there earlier today) and want one of the flat panel monitors, which card do you end up with? Not the Nvidia product, but the 32MB Radeon.
Sorry, but it's really not fair to put them in the same category as Microsoft just because they have the best cards out there that people want.
Re:Not ready for prime time (Score:3)
---
NV1, a crying shame. (Score:3)
The NV1 boards were sold as being:
(a) The first Direct3D accelerator and
(b) Compatible with the Sega Saturn.
Both of which turned out to be bullshit. While it did ship some actually quite good DirectX drivers, the direct 3d aspect of it was being "worked on". They did eventually ship some, with a huge disclaimer along the lines of "we know these things blow chunks, it's all due to this quadratic surface thing". This was all in the middle of John Carmack's big Direct3d rant on usenet ("I am looking forward to doing an apples to apples comparison of Direct3D and OpenGL" turned into "I am not going to finish, and there will never be, a Direct3d port of quake").
It also had onboard sound that was very damn good. The MPU401 in particular was of near Sound Canvas quality. Unfortunately it was not Sound Blaster compatible, and since the Direct3D port of Quake was never going to happen, games basically stayed in DOS and sound effects had to be sent to a separate card.
Saturn compatibility turned into "Sega will be porting games", which of course they never bothered to do because only eight people bought the cards.
And Diamond were just shit about the whole thing. Haven't bought a thing from them since, don't know anyone else who has either.
So in essence: Great silicon, some serious forward steps were taken; Shitty marketing, I guess they learned.
Dave
BTW, while we're on the subject of bullshit graphics accelerators, did anyone ever get texture mapping going on a Millennium2?
Re:turnabout (Score:4)
The modern approach to drawing these curved primitives is to tesselate them into many triangles. If you had to draw a cubic spline using some quadratic primitive, you would have to "tesselate" your cubic primitive into smaller quadratic ones.
Tiling is where you divide your screen into tiles (say, 16x16 pixels per tile). The hardware buffers all your glTriangles (or whatever), and doesn't draw a thing until you glSwapBuffers (which would normally display the picture). At that point, it will sort all the triangles into the tiles (some triangles straddle several tiles). Then, each tile is drawn one after the other. The main advantage is that this per-tile work can all be done on-chip instead of doing it in frame buffer memory. Frame buffer bandwidth is one of the most limiting aspects of modern graphics chip design.
The Kyro also uses a scan-line algorithm (if I understand correctly). In particular, it actually sorts all the surfaces from front to back (or maybe back to front) to get transparencies correct (or at least, its Sega incarnation in the Dreamcast does this). Unfortunately, this great feature will run against the 3D APIs (both DirectX and OpenGL), which don't sit easily with that kind of business. Nevertheless, I do believe that such an extension will go into the APIs in the near future; the developers are asking for it and the chip designers are trying to get it working.
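To make the binning step concrete, here is a rough software sketch of what "sort all the triangles into the tiles" means (the tile size and data structures are made up for illustration; real hardware keeps these bins in dedicated on-chip memory):

    #include <stdlib.h>

    #define TILE_SIZE 16  /* 16x16 pixels per tile, as in the example above */

    typedef struct { float x[3], y[3]; /* screen-space verts; z, uv, etc. omitted */ } Tri;

    typedef struct {
        int count, capacity;
        const Tri **tris;          /* triangles overlapping this tile */
    } TileBin;

    static int imin3(float a, float b, float c) { float m = a < b ? a : b; return (int)(m < c ? m : c); }
    static int imax3(float a, float b, float c) { float m = a > b ? a : b; return (int)(m > c ? m : c); }

    /* Add one triangle to every tile its screen-space bounding box touches. */
    void bin_triangle(TileBin *bins, int tiles_x, int tiles_y, const Tri *t)
    {
        int x0 = imin3(t->x[0], t->x[1], t->x[2]) / TILE_SIZE;
        int x1 = imax3(t->x[0], t->x[1], t->x[2]) / TILE_SIZE;
        int y0 = imin3(t->y[0], t->y[1], t->y[2]) / TILE_SIZE;
        int y1 = imax3(t->y[0], t->y[1], t->y[2]) / TILE_SIZE;

        for (int ty = y0; ty <= y1; ty++) {
            for (int tx = x0; tx <= x1; tx++) {
                if (tx < 0 || ty < 0 || tx >= tiles_x || ty >= tiles_y)
                    continue;
                TileBin *b = &bins[ty * tiles_x + tx];
                if (b->count == b->capacity) {
                    b->capacity = b->capacity ? b->capacity * 2 : 8;
                    b->tris = realloc(b->tris, b->capacity * sizeof *b->tris);
                }
                b->tris[b->count++] = t;  /* straddling triangles land in several bins */
            }
        }
    }

Once every triangle for the frame is binned, each tile can be resolved and rasterised entirely on-chip, which is where the bandwidth saving comes from.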
I design 3d chips for an important graphics company but because of my company's culture, I'd better stay anonymous and not even say whom I work for.