Graphics Software

A Brief History Of NVIDIA And SEGA

Alan writes: "FiringSquad just posted an article on the history of NVIDIA. What makes this interesting is that they include a little bit about the NV2 chip which was developed originally for the Dreamcast. It was using quadratic texture maps (a derivative of NURBS) rather than polygons! The article is over here."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • PowerVR was on the PC long before the Dreamcast.

    The PowerVR PCX2 was the highest-profile chip from PowerVR (an alliance between VideoLogic and NEC); it shipped mid-to-late 1997 as an add-on solution from VideoLogic and Matrox. The cards had 4MB of RAM on them, but that was all texture memory, since the chip was able to share the framebuffer with the 2D card. That also enabled it to run 3D in a window, something the Voodoo1 couldn't do. Its MiniGL drivers for Quake-based games were really good and put it on par with a Voodoo1, but its Direct3D drivers stunk. PowerSGL games ran great, but they were few and far between, and the most high-profile one was probably Unreal. The PowerVR Second Generation chip (PVRSG) was supposed to ship mid-to-late 1998, but it kept getting delayed until it finally appeared on the dead-on-arrival Neon 250 around the end of '99.
  • I normally keep pretty quiet when it comes to conspiracy theories, but this poster is right on the mark.

    I don't know about NVIDIA being a monopolist like Microsoft, but he's definitely on to something. Look, the HW/SW graphics development arena is controlled by only a few select vendors, and now that hold is merging between the console industry and the desktop industry. As a consumer I want as many choices as I can get, and if NVIDIA becomes the top playah with their rep, it will surely drive competition down and prices up.

    I don't think the whole console market would fall, but I would sure like to get the next-gen console without having to pay upwards of $350 US for it.

  • by Christopher Thomas ( 11717 ) on Friday February 09, 2001 @09:25PM (#442376)
    The hardware folks like open, certified specs [PCI, AGP, USB, etc.] that they can conform to. Any company with a few talented chip engineers could make a card that outperforms the GeForce 2's, make it plug into any AGP board, and compete.

    While to some extent that's true, in practice you'd get your arse sued off until you could prove in court that you really *weren't* violating any of nVidia's implementation patents (or anyone else's). There were a few sabre-rattling sessions last year in this vein.

    There are only a few straightforward approaches to building any given part of a graphics pipeline. Just about all of these are patented (usually preemptively) by the big graphics card companies. It's not as bad as the software patent arena, but it's still not nice.

    If you have lots of money, you can hold out long enough that the big companies will offer to cross-licence technology with you. Otherwise, you'd better pray that they don't consider you a threat.

    Perhaps it's not quite this bad, but you'd have to do quite a bit of patent research to avoid stepping on anyone's toes.

    Lastly, a nit-pick re. AGP, PCI, and USB. For some of these, you have to pay licensing fees to get the specification manual. For all, if I understand correctly, you have to pay licensing fees to build any hardware that talks to them. The standards bodies are money-driven too.

    In practice, the cost will be low compared to the cost of the rest of your card, but it's still there.
  • If you're using Linux, Radeon is arguably a better choice. Open specs, free (as in speech) drivers. It's also a (slightly) better performer than the GF2 in 32-bit land. I'm using a CVS snapshot of the new X server with Radeon-enabled DRI right now. It's pretty sweet. Quake doesn't run fast enough on my setup yet, but GL-enabled Doom sure is nice!
  • But it's perty.
  • This is flamebait if I ever saw it..
    You will never have a console game that looks better than its computer counterpart.. consoles just can't keep up with the dev cycle of computers.
    Sure, they might look good for 4-6 months, maybe a year after release.. but computers will always surpass them (and consoles are usually around for 4+ years..)

    In fact, when quake2 was being ported over to the psx and n64, they had to /decrease/ the levels' size and use smaller textures. (Especially the n64 because of the small video ram in it)

    Plus there's the addons that you can't use on a console version of a comp game.. no hope of adding user made maps, mods, player models, etc.
    And on top of that, you can't play the same game on the latest console.. they're all incompatible (The same way comps were back in the early 80's)..
    If I wanna play quake1, 2 and 3.. then I need quake 1 and 2 for psx or n64 and then 3 for my dc.. If I was doing the same thing with my comp then I'd just need to install them and start playing.

    hrm.. this is too offtopic as it is..
    -since when did 'MTV' stand for Real World Television instead of MUSIC television?
  • by mTor ( 18585 ) on Friday February 09, 2001 @10:14PM (#442380)
    What makes this interesting is that they include a little bit about the NV2 chip which was developed originally for the Dreamcast. It was using quadratic texture maps (a derivative of NURBS) rather than polygons!

    I'm sorry, but this makes no sense. The NV2 chip used quadratic surfaces, not quadratic texture maps. This is like comparing apples and oranges.
  • The Evil Kyro is a fairly good budget card because it uses tile-based rendering and 64MB of RAM for around $120. Its real failure, though, is the absence of hardware T&L. The next generation will have it and should be a worthy competitor in some niche, although I don't expect it to hold up to the NV20.
  • I agree that NURBS, and most other patches, suck. But that is why subdivision is so great: you have arbitrary topology, yet it is curved. Some argue that subdivision surfaces do not have a natural UV space and are therefore "broken". To me this is also great, because it gives us the same level of control as with polygons to define our own texture space. A creased subdivision implementation can do everything a polygon implementation can and more.

    Don't bash curved surfaces just because patches don't do what you want them to.
    Subdivision really is the future; maybe it is too early to implement in a game today, but it is definitely possible to implement an engine based on subdivision surfaces and get good results. I have done it.

    http://verse.sourceforge.net/
  • I would think that being the provider of the graphics chipset and one of the groups giving MS input on DirectX 8.0 would definitely show a Microsoft-nVidia alliance. I must say, though, that there are markets where nVidia doesn't have the upper hand, and in fact can't seem to figure out how to compete. That is the high-end graphics market (designers and such). nVidia's tech is all about pushing pixels, and the designers spend more time with wireframes and a few other features. Last I heard, Diamond had them creamed in this market.

    Unfortunately, I have to say that nVidia has cleaned up in the consumer market. Matrox has relegated itself to building solid business cards (which is why the G450 ended up being slower than the G400), and ATI has completely lost in my mind because their drivers suck.

    With the loss of 3dfx as a reasonable competitor, I would have to say that nVidia has a virtual monopoly in the home-gamer 3D graphics arena. I just hope they don't stop building significantly better cards for that arena just because they are able to make money at the status quo.

  • Nvidia was the underdog. They developed all of their own chipsets in-house, and only bought 3Dfx after 3Dfx had already been dethroned.

    I've been in the PC industry for many years, so I am quite aware of the market. I bought my first Nvidia card quite cautiously (the Riva TNT), and installed it in a machine which sat right beside a machine with a Voodoo 3 2000 inside. The difference in graphics performance was minor, but the Riva TNT had drivers which were updated more frequently, seemed more stable, and the card was cheaper. This process has been repeated with the Riva TNT 2, Voodoo 3 3000, the Voodoo 3500 TV, Voodoo 5500, and various incarnations of the Nvidia GeForce/GeForce 2, both in my personal computers and in the machines assembled for customers. Starting towards the middle of last year, most of our more knowledgeable customers weren't buying the Voodoo series, but were asking for cards in the Nvidia series.

    Plain and simple: Nvidia conquered 3Dfx by better pricing, equal or better technology, and a more rapid development cycle. No one expected it, any more than anyone expected AMD to ever better Intel.

    This happens in software as well:

    Wordstar, WordPerfect, Word.

    VisiCalc, Lotus 1-2-3, Excel.

    Dbase, FoxPro, Clipper, Access.

    Christ, it was amazing when CP/M was dethroned by MS-DOS, or when CD's bested LP's or tapes.

    The technology doesn't have to be better (Motorola versus Intel, VHS versus Beta), but Nvidia was at least the equal of, and arguably better than, 3Dfx. It wasn't a matter of buying "the little guy, re-label[ing] the IP and sell[ing] it as their own."
  • Actually, it appears that ATI lost out to Nvidia a while ago per this [cnet.com] article. And with Apple Computer's recent decision to not only offer Nvidia chips on their new Macs, but make them the default option, how long before Nvidia rules the world? Don't get me wrong, I like Nvidia; I have an original TNT in my computer. I just think that a little competition is a good thing.
  • A 2d graphics card that a LOT of people happen to have and lets them use pretty much all of Be. If you decide not to call that a driver you're just being an asshole.

    -----------------------

  • All water under the bridge now, of course. NV20 rocks. :-)

    How do you know? Have you been tweaking around with a prototype? I'd sure like to hear NVidia's announcement on the NV20 being named and released, or at least know when to expect it to be released.

  • > and the implementation of games thereof.

    Er, games of subdivision? I think the word you're looking for is "therewith".

    OK, I'll fuck off now.
  • One VAR has already hinted that there will be an NV20 product in March. nVidia is on track with this. I'm sure high-profile devs such as Mr. Carmack have samples.
  • JC: I am on record as saying that curved surfaces aren't as wonderful as they first seem, though.

    I know, you didn't turn up at GTS....

    The bad thing is that hardware vendors are trying to come up with hardware that can do NURBS and other patch surfaces that basically suck, instead of creating something general.

    All points on a NURBS/Loop/Butterfly/Catmull-Clark surface are basically just weighted sums of a number of CVs in the control mesh. This means that for each vertex you can create a list of the CVs it depends on and quickly re-shape the curved mesh if the control CVs are moved (see the sketch below). This could be done in hardware, and would really be an easy way to accelerate all types of tessellated surfaces (even polygon-reduced meshes).

    I have implemented this in my experimental api: www.obsession.se/ngl
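
    A rough sketch in C (not from the ngl API; all names made up) of the idea above: each tessellated vertex keeps a precomputed list of the control vertices (CVs) it depends on, together with their basis weights, so the curved mesh can be re-shaped cheaply whenever CVs move:

    #include <stddef.h>

    typedef struct { float x, y, z; } Vec3;

    typedef struct {
        int   cv_index;   /* which control vertex this term refers to */
        float weight;     /* its precomputed basis weight */
    } CVDep;

    typedef struct {
        CVDep  *deps;     /* dependency list for one tessellated vertex */
        size_t  num_deps;
    } VertexDeps;

    /* Re-evaluate every tessellated vertex as a weighted sum of CVs after
     * the control mesh moves; this is the fixed-function work the poster
     * suggests could live in hardware. */
    void reshape_surface(Vec3 *out_verts, const VertexDeps *deps,
                         size_t num_verts, const Vec3 *control_verts)
    {
        size_t v, d;
        for (v = 0; v < num_verts; ++v) {
            Vec3 p = { 0.0f, 0.0f, 0.0f };
            for (d = 0; d < deps[v].num_deps; ++d) {
                const Vec3 *cv = &control_verts[deps[v].deps[d].cv_index];
                float w = deps[v].deps[d].weight;
                p.x += w * cv->x;
                p.y += w * cv->y;
                p.z += w * cv->z;
            }
            out_verts[v] = p;
        }
    }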
  • Don't bash curved surfaces just because patches don't do what you want them to.

    I don't think that's bashing them. Curved surfaces have their problems too; for one, they don't add detail. Curviness isn't detail, it's smoothness, and even though it looks a lot better with low-poly models most of the time, subdivision doesn't add much to a model over 8000 polys. Very often you find yourself adding polygons that are not smooth, but carry detail (hard edges).

    John Carmack pointed out the texturing problems, but there are loads more, like level of detail, texture coordinates and material seams. Triangles are about as stupid a primitive as you can have for graphics, but they are extremely convenient. For example, their bounding volume is bounded by their control points, they are discrete (i.e. can be detached and manipulated individually), and they can be textured in any way possible.

    I have the feeling that most curved graphics primitives (though maybe not subdivision surfaces) are going to lose out to triangles.

    Put simply: geometric complexity used to be a problem in 3D graphics, but with the extreme bandwidth and pipeline increases we see today, the stream of triangles is becoming less of a problem. A good LOD algorithm can get the triangles/pixel ratio down to around 1, which will be a very feasible ratio in a very short while.

    Most of the added processing time nowadays goes towards shadowing, detail maps, projective lights, and motion blurring/oversampling/other framebuffer tricks, but it seems the triangle limit is diminishing.

    All of these new techniques get easier with triangles. With adaptive subdivision they get harder to implement. Try to do animated soft-skinned subdivision surfaces with adaptive subdivision and shadow-casting.

    Triangles are stupid, but in the end, all graphics cards draw triangles. I think that curved surfaces won't be all they were hyped up to be; they simply add too little at too much cost.

    Michiel

  • I find it interesting that the earliest NVIDIA engineers were from Sun. To me, this explains A) their preoccupation with surfaces as a primitive, long a tendency within Sun due to their MCAD focus, and B) their appreciation for the value of short cycle times. 3dfx, by contrast, was founded in part by engineers from SGI (e.g. Gary Tarolli) who had a stronger appreciation of how to optimize great 3D technology and astutely balance technological tradeoffs, but perhaps less cultural appreciation for obscure business issues like cycle times and the ability to execute.

    Just as Sun was able to muscle out SGI in the workstation market by steadily raising the bar and focusing on good-enough-for-my-core-markets improvements, so NVIDIA beat 3dfx. Although one could always argue that 3dfx beat itself. Still, in my mind, the turning point was when NVIDIA hired a lot of key engineers fleeing SGI, which was in the process of spiralling downward. If 3dfx had gotten that team, we might be looking at a different story.

    In any event, it's remarkable that engineers from either Sun or SGI were able to shift their mindset from product cycles of 3 years (workstation graphics) to product cycles of 6 months (PC 2D graphics a la Cirrus Logic). If you've grown up building projects the slow way, a faster tempo is not easy to adjust to! I've always thought that besides his brilliance, John Carmack's early exposure to 1-month product cycles helped him develop the focus needed to keep his current 18-month(?) product cycles going, something his competitors have really struggled to match.

    It'll be interesting to see if the lack of discipline and business focus so evident within SGI engineering begins reasserting itself at NVIDIA as pressures on them ease and the early SGI engineers look at the comfortable value of their options which should be fairly vested by now. As a consumer, all I can say is that thank goodness ATI is around to keep the pressure on them. :)

    People matter. There's an interesting book about how the 3D hardware market was won and lost just waiting to be written.

    --Greg
  • by DeeKayWon ( 155842 ) on Friday February 09, 2001 @06:36PM (#442393)
    Right after a lawsuit by SGI against nVidia over a PCI DMA technique was settled, they allied and nVidia now has a lot of SGI IP at their disposal. The OpenGL stuff they got from SGI is probably why they can't open up the source to their XFree86 drivers.
  • I don't think he/she's dumb. NVidia dethroned them using IP developed and consolidated by leveraged buyout, not a better R&D pipeline. Check your facts. NVidia is not the biggest player on the block, but they are known for their buy-vs.-build mentality. This is a known formula that always seems to end in bankruptcy.

    Just my 2 cents.

  • It scares me too. Any decrease in competition (ha, I said tit) is bad, and will ultimately lead to higher prices and less selection for us.
  • anyone know how similar that is to what NVIDIA tried with quadratic surfaces?

    Not at all. Quadratic surfaces were a fundamentally better way of describing what shape something is in three dimensions. Tile-based rendering still has polygons, but breaks the screen up into tiles to do the final render. IIRC the optimisation runs something along the lines of "is this tile entirely covered by something, all of which has a 'nearer' Z? If so, bugger it, I won't even try".

    They scale particularly well, BTW. The PowerVR chip was designed to minimise bandwidth and be scalable to 16 ways for arcade machines. The tiles glue together in a kind of smarter-SLI fashion.

    There were also early PVR1 accelerators around that rendered to a lump of memory for the 2D card to then actually display. Obviously a bit flawed for today's framerates, but a perfectly valid option at the time. Again, driver shiteness killed these off, and I have a suspicion that NEC/VideoLogic signed an exclusive deal with Sega for the PVR2.

    I hope they can get PVR2 accelerators together for the PC. Tom's benchmarks give the impression of an altogether more interesting approach than the Nvidia 'hit it with more grunt' thing that's going on at the moment, and potentially a greater advantage as the number of vertices grows relative to sheer fillrate requirements.

    Still. I'm relatively happy with a voodoo3, so don't look at me.

    Dave
  • I am frightened of the day when AOL picks up a console system company. They will be worse than nintendo was for two reasons. The games will be even stupider than a great deal of the nintendo games (I hated that "kid-oriented" phase of nintendo) and everyone will play them anyways.

    Although it would be kind of nice to get a free console in the mail once a month :P Unfortunately it will only cost $200 a month to get it activated and you will have to download your games from the net every time you want to play.

    They will also put rotating banner ads at the top of the games and a little display that sucks your mind out in the controller, and if you call their 1-800 number they will install a device in your wallet that automatically sends all your money to them.

  • What in god's name is a quadratic texture??? I can imagine 2D or 3D procedural textures defined in terms of quadratic equations, but I somehow doubt this is what michael is on about.

    He's probably talking about these cards having hardware support for quadric surfaces.. but since it's been fairly much decided that it's generally faster to decompose these kinds of surfaces into triangular meshes for rendering, especially in pipelined graphics architectures such as OpenGL, features like this don't get used much.

    Raytracers like POV-Ray and others evaluate quadrics and other classes of curved surfaces on a pixel-by-pixel basis, and other renderers subdivide them down to sub-pixel-sized triangles for rendering. However, this approach is seldom suitable for realtime applications.

    Most likely, this 'quadratic texture support' is just a hardware implementation of OpenGL evaluator functions, which simply assist in the decomposition of quadric surfaces into triangle meshes (a rough sketch of the evaluator path follows below).

    Get a clue michael.
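
    For reference, a minimal sketch of the standard OpenGL 1.x evaluator path the post refers to: control points go in, and GL tessellates the patch into an ordinary mesh for the rasterizer. The control-point values are made up, and a bicubic Bezier patch is used purely for illustration (true quadrics would go through the same decompose-to-mesh idea):

    #include <GL/gl.h>

    /* Hand GL a 4x4 grid of control points and let the evaluator turn the
     * curved patch into a plain 20x20 mesh of polygons. */
    void draw_bicubic_patch(void)
    {
        static const GLfloat ctrl[4][4][3] = {
            {{-1.5f,-1.5f,0},{-0.5f,-1.5f,1},{0.5f,-1.5f,-1},{1.5f,-1.5f,0}},
            {{-1.5f,-0.5f,1},{-0.5f,-0.5f,2},{0.5f,-0.5f,-2},{1.5f,-0.5f,1}},
            {{-1.5f, 0.5f,1},{-0.5f, 0.5f,2},{0.5f, 0.5f,-2},{1.5f, 0.5f,1}},
            {{-1.5f, 1.5f,0},{-0.5f, 1.5f,1},{0.5f, 1.5f,-1},{1.5f, 1.5f,0}},
        };

        glMap2f(GL_MAP2_VERTEX_3,
                0.0f, 1.0f, 3, 4,      /* u range, stride between points, order */
                0.0f, 1.0f, 12, 4,     /* v range, stride between rows, order  */
                &ctrl[0][0][0]);
        glEnable(GL_MAP2_VERTEX_3);

        glMapGrid2f(20, 0.0f, 1.0f, 20, 0.0f, 1.0f);
        glEvalMesh2(GL_FILL, 0, 20, 0, 20);
    }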

  • by Anonymous Coward
    > The OpenGL stuff they got from SGI is probably why they can't open up the source to their XFree86 drivers.

    That's stupid and doesn't make any sense at all. SGI open sourced the OpenGL reference implementation and GLX. Smart money is on the AGP code that nvidia licensed.

    Of course, none of this explains why they won't release the fucking info needed to write an independent Mesa + DRI driver for their cards. Their drivers are great, but an open source one would be even better. Since we can't really expect them to release the source for their drivers, releasing the programming info would be a reasonable substitute.
  • Wrong, the GeForce is now supported under Be, has been since late 2000.

    -----------------------

  • Really? Does it support full 3D acceleration? If not, I don't count that as a "driver". A GeForce without 3D is just a generic 2D graphics card.

    ------
  • I work for a low-profile SMALL game company, but we are making an Xbox game, and we have NV20 samples. In fact, I believe it would be possible for me to buy an NV20 for about $500 directly from nVidia. I'm sure Id has a few.
  • by John Carmack ( 101025 ) on Saturday February 10, 2001 @12:07AM (#442403)
    The article hints that the NV1's quadratic surfaces might have actually been a good thing, and it was held back by Microsoft's push to conformity with triangles.

    Er, no.

    For several years now, Nvidia has been kicking ass like no other graphics company, but let's not romanticize the early days. The NV1 sucked bad, and it would have been damaging to the development of 3D accelerators if it had gotten more widespread success. Microsoft did a good thing by standing firm against Nvidia's pressure to add quadratic surfaces to the initial version of D3D.

    There is an intuitive notion that curved surfaces are "better" than triangles, because it takes lots of triangles to approximate a curved surface.

    In their most general form, they can be degenerated to perform the same functions as triangles, just at a huge waste in specification traffic.

    Unfortunately, there has been a long string of products that miss the "most general form" part, and implement some form of patch surface that requires textures to be aligned with the patch isoparms. This seems to stem from a background in 2D graphics, where the natural progression from sliding sprites around goes to scaling them, then rotating them, then projecting them, then curving them.

    3DO did it. Saturn did it. NV1 did it. Some people are probably working on displacement mapping schemes right now that are making the same mistake.

    Without the ability to separate the texturing from the geometry, you can't clip any geometry in a general way (not even mentioning the fact that clipping a curve along anything but an isoparm will raise its order), and you either live with texel density varying wildly and degenerating to points, or you have texture seams between every change in density. No ability to rotate a texture on a surface, project a texture across multiple surfaces, etc. You can't replace the generality of a triangle with primitives like that.

    Even aside from the theoretical issues, NV1 didn't have any form of hidden surface removal, and the curve subdivision didn't stitch, frustum clip or do perspective. It was a gimmick, not a tool.

    All water under the bridge now, of course. NV20 rocks. :-)

    John Carmack
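
    A rough illustrative sketch (not from the post above; all names made up): a triangle vertex carries its own (s,t), so a texture can be rotated or projected across surfaces freely, while a 3DO/Saturn/NV1-style patch derives (s,t) from the patch parameters, welding the texture to the isoparms:

    /* Triangle pipeline: every vertex stores an independent texture coordinate. */
    typedef struct { float x, y, z; float s, t; } Vertex;

    /* Patch-locked scheme: evaluate one point of a biquadratic Bezier patch
     * (3x3 control points); the texture coordinate is simply the patch
     * parameter itself, so there is no independent UV channel to remap. */
    void eval_patch_locked(const float cp[3][3][3], float u, float v,
                           float pos[3], float st[2])
    {
        float bu[3] = { (1-u)*(1-u), 2*u*(1-u), u*u };   /* quadratic Bernstein */
        float bv[3] = { (1-v)*(1-v), 2*v*(1-v), v*v };
        int i, j, k;

        for (k = 0; k < 3; ++k) pos[k] = 0.0f;
        for (i = 0; i < 3; ++i)
            for (j = 0; j < 3; ++j)
                for (k = 0; k < 3; ++k)
                    pos[k] += bu[i] * bv[j] * cp[i][j][k];

        st[0] = u;   /* texture s locked to the u isoparms... */
        st[1] = v;   /* ...and t to the v isoparms; no rotation or projection */
    }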
  • By that definition Microsoft is nowhere near a monopoly, because it has competitors that are nowhere near going under. Notice I said virtual monopoly, meaning that they command a large portion of the market and can for all intents and purposes choose their prices.

    I already mentioned that they are not the only company, just the only one with much power in the market. I don't put nVidia down for it; I just wish other companies would jump in and make a new good product.

  • HeLLo I AM JokEmon And I DEclARe MySElf ThE RuLer Of SLaSHDOT FEGGETS
  • by Lotek ( 29809 ) <Vitriolic@NOSpaM.gmail.com> on Saturday February 10, 2001 @10:17AM (#442406)
    Dude. John Carmack.

    Video card makers actually ASK HIM what he wants to see in the next generation video cards, then scurry off and figure out how to do that. I don't doubt that they send him a few engineering samples once they have working prototypes.

    No company is so suicidal that they are going to create a video card that won't run Doom2001 (or whatever it's going to be called) perfectly.

  • by John Carmack ( 101025 ) on Saturday February 10, 2001 @11:23AM (#442407)
    My point was that with texturing tied directly to the patch orientation, you can't do exactly the thing that you describe.

    I'm not a big booster of hardware curves in any form, but I only rail against hardware schemes that use aligned textures.

    John Carmack
  • Through aggressive product development, near-perfect execution, and very strong arms, NVIDIA rose to the top of the industry. Today, NVIDIA owns all of 3dfx's intellectual property, has numerous OEM PC deals and has recently become Apple's default high-performance desktop graphics provider. Soon we'll see the mobile GeForce2 Go shipping in laptops from Toshiba, and we can't forget the upcoming introduction of NVIDIA's next generation NV20 GPU. After that, NVIDIA can look forward to the launch of Xbox as well as its new line of motherboard chipsets.

    In a word. . . Bullshit

    Here's why. First, the product development efforts were less than aggressive, and the execution of the implementation was so poor that most mods were never even tested. NVIDIA is one of those companies that never really understood that the eighties were over. Leveraged buyouts of other people's IP based solely on future profits are a recipe for disaster. The GeForce2 efforts are not ready for delivery to Toshiba and Toshiba is pissed. Unfortunately Toshiba is on the hook for delivery and they're going to be stuck until NV20. The X offering is only a pipedream at this point.

    Look, efforts like this are based upon waiting for little companies to develop a similar offering; then they buy up the little guy, re-label the IP and sell it as their own. We in the end are the ones who get screwed because product hits the street late and is not ready for prime time.

  • Can you buy a fast PC that runs Unix for the price of a console? Do you need to run a server if all you want to do is play games? Consoles aren't silly; they fit their niche perfectly.
  • Although the NV1 was technologically superior to other chips of that era from a number of perspectives, the proprietary quadratic texture mapping of the NV1 was its death sentence. When Microsoft finalized Direct3D not too long after the NV1 had reached store shelves, polygons had been chosen as the standard primitive, and despite NVIDIA's and Diamond's best efforts, developers were no longer willing to develop for the NV1.

    This is nonsense and self-contradictory. Microsoft's Direct3D didn't kill the NV1; a closed-source, proprietary texture mapping procedure that was incompatible with any other card killed the NV1. Had nvidia worked more closely with developers before Direct3D became the standard, the NV1 could have been accepted and implemented on a larger scale. True, Direct3D displaced the NV1, but blaming Microsoft for the death is pointless: proprietary policies killed the NV1, not its competition.
  • I inadvertently bought a video card a few years ago called the Diamond 3000 or somesuch. It was able to play Sega Saturn games, I believe, or maybe just Sega Saturn games lightly modified to play on the PC using this card. It had 2 Sega Saturn joystick ports, and I think it came with 2 joysticks... a friend of mine had a Saturn and tried those instead, and they worked. I still have that darn card... never really used it to its potential.
  • The idea of Sega and Nvidia working closely together is a scary thought. They would bring a lot of power to any field they were involved in. I imagine some of you readers are mainly just interested in the technological aspects of the situation, but I find the economic aspects a bit troubling.

    Nvidia is a known monopolist. No one can touch them in the graphics field now. They are surprisingly similar to Microsoft, except there is no equivalent Apple in the graphics field to provide a counterpoint.

    Sega almost reached the levels of monopolist in the console industry, but fell short. If they work with Nvidia, it's conceivable the console market would fall to this conglomeration.

    Is this really what we want? Even less choice in the console market than we already have? An even stronger assurance of Nvidia's monopolistic dominance in the graphics card market?

    Personally, it scares me.

  • Sega almost reached the levels of monopolist in the console industry, but fell short. If they work with Nvidia, it's conceivable the console market would fall to this conglomeration.

    Hey, you needn't worry about a Sega/Nvidia monopoly in the console market! Remember, competition will be provided by Microsoft's X-Box!

    uh... that probably doesn't make you feel much better, does it?

  • If you read more of the article.. Nvidia only worked with Sega for a little while, then Sega dropped Nvidia.. this was back when Nvidia was small.. I think around 1993-94?
  • As this chip was designed for the console, rather than the PC, I have to question how good it is going to be, and how well developed for the PC market. It is designed for consoles, which have very wide busses between graphics ram and main ram, and fast dedicated support chips for this environment.

    However, the PC does not have those high speed wide busses for interchange between main and graphics ram. I fear that this may cause some performance loss, as although the chip will still be good, it will not be optimal.

    The limiting factor on games and graphics performance these days is not calculation speed; it is bandwidth, pure and simple. In ten years, everything will be bandwidth. If you consider that in one clock cycle on a modern chip in the 1 GHz range light travels only about a foot (a 1 ns cycle times roughly 3x10^8 m/s is about 30 cm), we can see that information interchange between CPUs and memory will become fundamentally limited by the speed of light within 18 months, as the distance from CPU to memory exceeds the distance that light travels in a clock cycle. The only way to improve is via bandwidth increases, and better use of cache.

    But that is the future, and it is the future that this chip was designed for, in the form of consoles, which are always technologically ahead.

    I fear this chip will suffer through being ahead of its time.

    You know exactly what to do-
    Your kiss, your fingers on my thigh-

  • The article mentions Sega dropped NVIDIA in favor of PowerVR for producing their console's graphics processor. What's interesting is PowerVR is now trying to move in on the PC graphics card industry itself with the Evil Kyro [http], which uses a tile-based rendering system; anyone know how similar that is to what NVIDIA tried with quadratic surfaces?
    --
  • The impression you got from the linked article was right. The impression I got from Diamond's marketing was wrong. GGNNNNNN!

    I guess the idea was that you could buy one card and have all your gaming video/sound/control needs taken care of.

    More or less, in theory. In practice of course, none of the games I wanted to play on it ran on DirectX. Oh well.

    Dave

  • Nvidia is a known monopolist. No one can touch them in the graphics field now.

    ATI has a bigger market share. And as much as I love Sega, teaming up with them would probably drag anyone down (of course, the article mentions nothing about Sega and NVIDIA teaming up, just saying)
    --
  • Hmmmm.... no. Sega is on the down-and-out right now. They are ditching the Dreamcast and leaving buyers screwed. They ditched their last system too (anyone remember the Saturn? Yeah, that $600 piece of crap).

    Yes, NVidia is a monopolist. I challenge you to find me a large company that is not. 3dfx was. They just failed. Intel is. AMD is trying to be. That's the nature of this our wonderful economy.

    Anyway, my point is that as SOL as Sega is right now as far as their console systems, I don't see ANYTHING giving them an easy comeback barring a merger with Time Warner/AOL. :-). I just don't see it happening, NVidia or not...

    Justin Dubs
  • by Xevion ( 157875 ) on Friday February 09, 2001 @07:21PM (#442420)
    Nvidia never used unfair marketing tactics. They have used some questionable ones when it came to a few websites, but that has been fixed.

    Nvidia achieved market dominance by providing good products at good prices, and coming out with new ones fast enough to overwhelm their competitors. This is not an unfair business tactic; they are just completely ruthless competitors.

    When Nvidia came out with the Riva 128, it had one advantage over the Voodoo: it could do 2D. And everyone already had a 2D card, so it really didn't matter. It was slower with the early drivers, it had crap image quality, and it ran Quake 2 at 10fps or so on my P133.

    The Voodoo2 came out and Nvidia had nothing but a slightly improved Riva 128. At this point people still didn't really care, but the 3D-only thing was starting to have an effect. People would run systems with Riva 128s and Voodoo2s for good 2D and great 3D, plus the option of real OpenGL.

    With the TNT, Nvidia had a performance-competitive product that had much better image quality too, and 3dfx's Banshee was a rehashed, weakened Voodoo2. When the Voodoo3 came out, Nvidia promptly took the performance lead with the TNT2/TNT2 Ultra, albeit by a small margin. From then on 3dfx was way too slow to stand a chance. ATI started to come back into the picture here, and they have been tagging along since.

    Nvidia releases new products too quickly for others to remain competitive, and they work quickly towards incorporating features OEMs want and lowering their prices; look at the extremely dominant set of video chips Nvidia has today. Everything they have out is best of class, or damn near it.

    MS won the market using completely ruthless tactics, and now they are their own greatest threat. Nvidia is still pumping out products like there is no tomorrow, and they are aware that if they get lax like 3dfx did, they will fall very quickly. ATI, while lagging behind, could jump right back into the front with one botched product release on Nvidia's behalf.

    Also, if ATI gets their act together quickly, writes some decent drivers, and gets a comparable card out at a significantly lower price, then ATI will be able to get some of that lost marketshare back. It will take a better deal to beat Nvidia, however, because the established brand can always charge more for the same thing. And ATI has a huge disadvantage there too, because they are competing with a ton of smaller boardmakers that can charge whatever they want.
  • Actually, I've always been a big fan of Sega, but I've always thought of them as being "that other console maker". When SMS came out, it didn't have the number of games that the NES did. When the Genesis/MD came out, they had a great console, but Nintendo had the SNES that still dominated. Sega Saturn never really took off, even though it was a great system (IMO), due to competition from Sony.

    Finally, the Dreamcast was in competition with the PS2, a console that didn't even enter the market until more than a year after the Dreamcast. I was just amazed when I would talk to people about consoles and I'd mention the Dreamcast, and they'd say, "I'm waiting for the PS2".

    As for the Nvidia thing... I wouldn't say that, "Nvidia is a known monopolist." Their lock on the video card industry is a fairly recent phenomenon... when the Geforce came out (1.5 years ago?). Anyway, there is fairly stiff competition for them on the horizon. Hardly a Microsoft situation.

  • Heh, I thought this sounded a little too intelligent for a slashdot post.... then I saw the JC...
  • ooh, viable competitors?

    who?

    Matrox's latest board is actually *slower* than their previous G400 Max card, and there still have been no announcements of the G800.

    ATi has been losing their OEM market, first with the PC OEMs and now with Apple. Even their notebook line is now under pressure from nvidia's chips. Without an OEM market, they are done; even with a better chip, it would take a lot before people would believe that ATi isn't releasing a piece of crap (their drivers in the past have been.. sub-optimal)

    NVidia has strong alliances with Microsoft and SGI, and now with their acquisition of 3dfx technology has blanketed the field of 3D graphics with patents.

    Most of the graphics-oriented people I know are about 40/40 on which of Matrox or ATI will go under first; the remaining 20% don't acknowledge that the companies are even still competing. Last I checked, there isn't anyone else even trying to make a dent in the field.
  • *grin* originally, I didn't say they were a monopoly. They aren't...

    ...yet :-)
  • I'd rather have a closed-source nVidia driver for performance reasons than an open-source one for political reasons. The Matrox G### cards are pretty open, but their performance is only a fraction of that of the nVidia closed-source drivers. Open Source is great, but isn't the solution to all problems. From appearances, having fast OpenGL is one of those "isn't a solution" cases, at least at this time.
  • Ok, now I don't want to flame you here like many others have done, but I just don't understand where the hell you're coming from: The GeForce2 efforts are not ready for delivery. You been to a computer store lately, sonny? You can get a GeForce 2 in no less than 3 flavours (MX, GTS, Ultra) from multiple vendors. I have seen these things in actual, real systems running actual, real games and they actually, really rule. There is nothing not ready for delivery about it; it was delivered (months ago) and has been working great.

    Now as to your comment product development efforts were less than aggressive and the execution of the implementation was so poor that most mods were never even tested, I'm not quite sure what you're saying here. You're implying that nVidia has poor product execution? That was true in the beginning, but now history has proved you wrong. Let's run down some of the events:

    --TNT comes out; not a bad card, and it provides a reasonable alternative to 3dfx that isn't so expensive.
    --TNT2 happens; now not only is performance near Voodoo levels, but you can get better quality too.
    --TNT2U, the first card I ever used that could push enough pixels to run games at a reasonable speed in 32-bit.
    --GeForce rolls on in. If this card wasn't huge, I don't know what was; I still have mine. It beat the snot out of all other consumer-grade offerings by a long shot and even keeps up with the more expensive (and later released) Voodoo5.
    --GeForce 2: 6 months and about 4x the power. Initially the Radeon kept up, but then Detonator 3 came out and it now consistently outperforms the Radeon.

    We in the end are the ones who get screwed because product hits the street late and is not ready for prime time. I don't see how you can possibly make this claim. Every product I've gotten from nVidia has not only been ready for primetime right out of the box, but has only gotten better as I've owned it. When I first bought my GeForce (about 2 months after they came out) I was floored by how much faster it was than my Voodoo3. Then, a few months later, it got significantly faster with the release of the Detonator 2 (5.x series) drivers, in addition to gaining some new features. Later still, Detonator 3 (6.x series) came out with another speed boost and a couple more features. Currently I have a beta of Detonator 4 (7.x series) and it again boosts speed and adds full DX8 support. Please remember that these speed increases are on top of already great performance. The GeForce was much faster than the Voodoo3 when it came out; Detonator 2 just widened the gap. The GeForce 2 competed well with the Radeon initially; Detonator 3 just ensured it won all the time.

    I also don't see how you can think the cycle is slow. Looking at graphics chip progress lately, I'd say power has been doubling about once every half year or so. Compare that to processors that double once every 18 months or so. Plenty of advancement if you ask me. nVidia has also been making big gains in bringing features previously reserved for pro cards to regular consumers. While their geometry engines still can't compete with Glint's high-end offering, it's a start and it's available for a reasonable price.

    At any rate, if I've missed what you're saying, rewrite your post to be more clear; if not, you need to do some research before shooting your mouth off. nVidia sure isn't the be-all, end-all of graphics, but they've done a whole lot.

    OH, and by the way, the NV20 isn't a "pipe dream"; it's very real and will be final ere long. You have to remember that chips don't just pop into existence, there is an amount of development time. The NV20 is off the VHDL and into actual silicon. It's not at production quantities yet, but it will be soon.

  • What a Chodesmoker.

    you mean choadsmoker?
    i think someone learned a new word reading the bonsaikitten article.
  • by John Carmack ( 101025 ) on Saturday February 10, 2001 @11:25AM (#442428)
    The hardware curved surfaces in upcoming hardware are basically OK, because you can have all the attributes independent of each other at each control point. My major point was that the 3DO/Saturn/NV1 botched it badly by explicitly tying the texture to the patch orientation, which prevents them from being able to do triangle-like things at ALL.

    John Carmack
  • by John Carmack ( 101025 ) on Saturday February 10, 2001 @11:29AM (#442429)
    I was ranting specifically about square patches that have implicit texture alignment, not curves in general. I am on record as saying that curved surfaces aren't as wonderful as they first seem, though.

    It was my experience that subdivision surfaces are much more convenient for modeling free form organic surfaces, but polynomial patches can be more convenient for architectural work.

    John Carmack
  • I'm curious, why aren't you a booster of hardware curves? It seems like a good idea to me, but I think you are *the* expert in this area. Is it because no one has yet created hardware curves that are sufficiently general? Is it because of poor implementations? Major slowdowns? Do they lack versatility? If HW curves are bad, please share your opinion as to which direction(s) the 3D graphics companies should be headed. What would you like to see in a top-of-the-line graphics card built 10-100 years from now? More memory, more texture units, voxels, stereo 3D, direct neural input, embedded DRAM, 64 bit color, 4x4 FSAA, etc.? How could they be improved that would make your life easier? Where do you think they could *best* use those transistors? Sorry, I got carried away and asked too many questions... :-) ...but I'm sure a lot of folks would like to know your thoughts on this subject.
  • The Matrox G### cards are pretty open, but their performance is only a fraction of that of the nVidia closed-source drivers.

    Perhaps because the nVidia hardware is significantly faster than the Matrox G### hardware.

    Really, if you want to insult the open sourced DRI effort, at least take the time to think the argument through first.

  • Hmmm, I stand corrected. Was going on old data I guess. Though those numbers seem a little off; I don't see how you can go from 20% to 48% of all desktops; I don't think most people upgrade their graphics cards, and I can't imagine that enough new desktops were produced this year to produce that number. They probably meant in new desktops sold that year.

    Anyway, I agree that nVidia makes great cards, but at least ATI's Radeon is giving the GeForce a run for its money. I usually recommend GeForces, but the Radeons seem to be able to beat them (except the Ultra, of course) at high resolutions in 32-bit. Competition doesn't go away when one company gets market dominance, just when all the other companies go out of business.
    --
  • Whoever said Nvidia doesn't use monopolist tactics, that's complete and utter bullshit. Look at that supposed "cross-license" agreement between S3 and Nvidia. S3 obviously didn't get anything out of it, with Nvidia getting S3TC royalty-free. Nvidia has stolen so much technology it's pretty sad. 3dfx sues Nvidia, Nvidia buys out 3dfx; if there was no guilt, why else did they do it? If Nvidia is good at engineering, then it's just as good at stealing technology from its competitors and then settling in court. It's also a master of legal maneuvering. Have any of you read the SEC filing by 3dfx, about the bankruptcy and asset completion? If 3dfx shareholders do not approve the sale, Nvidia WILL STILL GET ALL of 3dfx's technology and trademarks regardless. It reminds me of a lot of the crap Microsoft used to pull.

    As for Nvidia "earning" its position, that's the most pathetic excuse for allowing a monopoly to exist I've ever heard of. All the monopolies, Microsoft and the trusts of the early 20th century, "earned" their position in some way. That's no excuse. And if you keep denying the fact that Nvidia is a monopoly, just go to a store and ask them what the price of a GeForce2 Ultra is; that'll remind you what a monopoly is. And then think back to how much you paid for that TNT2 Ultra in your system. So all you Nvidiots out there, think about the company you love so much, and I hope none of you hate Microsoft either, otherwise it's gross hypocrisy.
  • I'm trying to figure out why you keep using the future tense. The NV2 is long gone. It WAS planned for the Dreamcast but was ditched for 3dfx and PowerVR. The NV2 is pre-RIVA128 (NV3), which is pre-TNT, which is pre-TNT2, which is pre-GeForce, which is pre-GeForce2, which will, not too long from now, be pre-NV20. The NV2 will not be a failure, it WAS a failure. A big one. Poor NVidia, I hope they recover... ;-)

    Justin Dubs
  • I really can't say too many bad things about Nvidia. They constantly pump out new drivers [tweakersasylum.com] in order to satisfy feature hungry consumers. They go so far as to offer drivers for not only Windows, but Linux, BeOS and even OS/2. At this rate though, maybe one day soon, I'll have to worry about doing driver updates on my game console in order to get that extra fps.

    Penguin rhetoric. The Linux Pimp [thelinuxpimp.com]

  • It should actually read.

    here [gamers.com]."over is article the. Polygons than rather (NURBS of derivative a) maps texture quadratic using was it. Dreamcast the for originally developed was which chip NV2 the about bit little a include they that is interesting this makes what. NVIDIA of history the article an posted just"SquadFiring [firingsquad.com] writes Alan

  • Honestly, if there were a Sega monopoly I'd rather give up consoles than use a Dreamcast controller, but seriously, don't the PS2 and GameCube prevent a monopoly?
  • by tolldog ( 1571 ) on Friday February 09, 2001 @06:18PM (#442438) Homepage Journal
    What I am surprised to see is the article's lack of mention of NVidia and SGI. The new SGI Intel boxes are running a variation of the NV15 chipset (and do a good job of it). Also, NVidia, if I remember right, has a ton of ex-SGI employees.

  • by Anonymous Coward
    the story goes Sega turned Nvidia down because they weren't a Japanese company. They'd have done better if they'd stuck with them. I can believe part of this, having worked in Japan and been involved with Sega, Sony etc.

    Sega never came close to a monopoly in the console industry; you're thinking of Nintendo. Sega are the ones who always seem to screw it up in the end.

    Monopoly is not the same as success. Nvidia have really smart guys working for them; they are successful because they are better than the competition.

    How can it be a monopoly when there's a choice? If other graphics vendors don't produce cards as good, and aren't taken up by the gamers (the biggest driving force behind PC graphics card success), then it's their fault for not having what it takes..

    Sega are the 3dfx of the console world.

    It's folks screaming monopoly every time a successful computer company comes along that scares me...
  • Consoles have gotten cheaper recently. The 3DO was $700 when it came out. The PSX was $300, and I think the Saturn was too; $300 is considered the max price for a console, and a Dreamcast is now $100. In a price-competitive world, if something is a ripoff it doesn't last very long, so don't expect to see console prices get out of hand. The Gamecube is probably going to be $200, the Xbox $300, so they aren't exactly getting more expensive.
  • Show me the facts! Do you have a link? This to me seems factually incorrect.

    No flame intended, but snarking about them losing their OEM market has nothing to do with it, and Apple's notebook line couldn't give a rat's ass about NVidia. MS and NVIDIA have no relationship at all. WTF? Are you a troll? Unfortunately now I have bitten, so maybe I'm just being a chump, but since I've already responded, I'll finish.

    Look, your figures don't add up. Matrox, ATI etc. have much less than 80% of the market. They are more around 7%, and to top it off everyone including Pegasus and Gen-H are in the mix and gaining market share quickly, especially being supported by Open Source. Look at KPW; they pulled 800% growth last year in their direct competition with NVIDIA. I believe you need to check your facts.

  • by AstynaxX ( 217139 ) on Friday February 09, 2001 @07:55PM (#442442) Homepage
    NVidia is not, and really is not likely to become, a monopolist in any meaningful sense. The hardware side of computers is not like the software side. Most specs are open, and the closed ones tend not to do well [the only exception I can recall in recent times is the ZIP, and I think that is due to a decent product, reasonable price, and good customer service when they screw up]; just look at Apple, or LS120 drives. The hardware folks like open, certified specs [PCI, AGP, USB, etc.] that they can conform to. Any company with a few talented chip engineers could make a card that outperforms the GeForce 2's, make it plug into any AGP board, and compete. The reason this doesn't work in software is because the specs aren't open; they are controlled by the monopolist. So, until NVidia releases their own slot for graphics boards, patented by them and used solely by their cards, don't get your panties in a bunch over their 'monopoly'. As 3dfx showed us, such monopolies in hardware rarely last long.

    -={(Astynax)}=-
  • They do not offer drivers for BeOS. They have allowed Be to write drivers for the TNT2 and below, but there is no GeForce or GeForce 2 driver for BeOS, and NVidia has never written a driver for BeOS themselves. If you click on the "BeOS" link at NVidia's driver page, you'll see that it just re-directs you to Be.

    Similarly, NVidia only provides TNT2-and-below drivers for OS/2.

    ------

  • I'm not one for conspiracy theories... Look at the history of the 3D card market. nVidia has toppled 3Dfx with a better product in a fairly short period of time. You can't say they're a monopoly, even as thousands are reaching for one of their cards right now. There are feasible competitors, research is ongoing and the market is still growing. It's quite conceivable someone else will surpass nVidia. Sega is hardly at the top of their market, especially with rumours of their troubles and pending reorganization and change of focus.

    However, there is definitely something to be said for the sum being greater than its individual parts, and it's not like both companies don't have marketing and research resources.

  • by Temporal ( 96070 ) on Friday February 09, 2001 @08:34PM (#442445) Journal

    NVidia does not have a monopoly in any market right now. What they do offer is the best (IMHO) consumer-level graphics card on the market. As well as the second-best. And the third-best. (I'm talking GeForce2 Ultra, GeForce2 Pro, and GeForce 2 here.) That is very different from having a monopoly.

    The ATI Radeon is also a reasonably good card, especially if you get the all-in-wonder version. EvilKyro, or whatever it is called, looks interesting, but I don't know all the details about it yet. The point is, NVidia does not have a monopoly.

    If you think about it, NVidia is now at the point that 3dfx was at a few years back: a year ahead of the competition. However, unlike 3dfx, NVidia is not encouraging people to use a proprietary API that only works with their own hardware. The best way to access an NVidia card is via OpenGL (even on Windows), which is the most open, cross-platform 3D API standard out there. NVidia has implemented proprietary extensions to OpenGL, but the extensions are purposely laid out such that it is easy to write software which only uses the extensions if they are available, and uses GL standard calls otherwise (see the sketch below). (I know this from experience...)

    Furthermore, unlike some well-known monopolistic companies, NVidia is still innovating at an alarming rate. As long as they keep doing so, and their prices stay where they are (GeForce 2 for $130, anyone?), I'm happy.
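
    A minimal sketch of that pattern (the extension name is only an example and the fast path is left as a stub): check the driver's extension string at run time and fall back to standard GL calls when the extension isn't there:

    #include <GL/gl.h>
    #include <string.h>

    /* Return 1 if `name` appears as a whole token in GL_EXTENSIONS. */
    static int has_extension(const char *name)
    {
        const char *ext = (const char *) glGetString(GL_EXTENSIONS);
        size_t len = strlen(name);

        while (ext && *ext) {
            const char *hit = strstr(ext, name);
            if (!hit)
                return 0;
            /* Make sure we matched a whole token, not a prefix of a longer one. */
            if ((hit == ext || hit[-1] == ' ') &&
                (hit[len] == ' ' || hit[len] == '\0'))
                return 1;
            ext = hit + len;
        }
        return 0;
    }

    void setup_vertex_path(void)
    {
        if (has_extension("GL_NV_vertex_array_range")) {
            /* vendor-specific fast path would be set up here */
        } else {
            /* fall back to plain, standard vertex arrays */
        }
    }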

    ------

  • I'm a huge Sega Saturn fan. I have lots of Saturn controllers. Does it seem likely to anyone that I could slap this board in my box as a second VGA controller, and then mostly use the controller port, and nothing else? I could always slap my 13" Mono VGA monitor on there if I absolutely HAD to have something on it.

    Also, while I'm on the subject (and slightly offtopic), anyone know if there's some other way to get Saturn controllers to work on a PC without having to do any soldering? (I know I could do it myself if I wanted to break out the iron. That's not what this is about.)

    The Saturn 3D Pad controller is perhaps the best non-vibrating game pad of all time. Analog joystick, digital pad, analog triggers, six buttons, and start. It also has a mode in which it emulates the normal saturn game pad. This controller kicks ass. I could put so much smack down in MW4 with that thing...


    --
    ALL YOUR KARMA ARE BELONG TO US

  • Nvidia a monopolist??? You have got to be kidding me. They out-competed the competition by bringing out superior products at competitive prices. That's why they are dominating right now. MS, on the other hand, stifled competitors that had superior products, such as DR-DOS. That's a real monopolist.
  • hehe, you can say that again.. I bought a geforce 2 gts card the other day for $200 and my graphics are /damn/ sweet..
    It'll take a few years for consoles to surpass this :P
    -since when did 'MTV' stand for Real World Television instead of MUSIC television?
  • A very engaging, objective view of the current industry leader. Few people seem to remember that even the original nVidia TNT was not a smashing success, let alone the scorn heaped upon the Riva 128 by 3Dfx owners when it had no Glide support (!!!). Hopefully nVidia will continue with great execution and great products. If not, I think the graphics market is competitive enough that consumers will happily escort them the way of 3dfx..
    ---
  • by GrandCow ( 229565 ) on Friday February 09, 2001 @06:30PM (#442450)
    They are surprisingly similar to Microsoft, except there is no equivalent Apple in the graphics field to provide a counterpoint.

    The difference here between Nvidia and Microsoft is that Nvidia isn't using monopolistic tactics, and there are viable competitors on the same platform. Nvidia just has better products, and no one else has a better card. Would you say that the Radeon isn't a good card with a nice bit of market share? If you go to any Gateway store (I was there earlier today) and want one of the flat-panel monitors, which card do you end up with? Not the Nvidia product, but the 32MB Radeon.

    Sorry, but it's really not fair to put them in the same category as Microsoft just because they have the best cards out there that people want.

  • by coupland ( 160334 ) <dchaseNO@SPAMhotmail.com> on Friday February 09, 2001 @06:30PM (#442451) Journal
    Uhm, sorry, but are you dumb? 3dfx had a complete stranglehold on the 3D market by locking down performance, price, and reputation. NVidia single-handedly dethroned them using superior products and a better R&D pipeline. You talk like NVidia has always been the big guy on the block when EVERYONE knows they fought a tremendous battle against an undisputed market leader. And won.
    ---
  • by WasterDave ( 20047 ) <davep@zedk[ ]com ['ep.' in gap]> on Friday February 09, 2001 @06:30PM (#442452)
    I had one, on a Diamond Edge2000. It shipped in a huge pile of hype about being:
    (a) The first Direct3D accelerator and
    (b) Compatible with the Sega Saturn.

    Both of which turned out to be bullshit. While it did ship with some actually quite good DirectX drivers, the Direct3D aspect of it was being "worked on". They did eventually ship some, with a huge disclaimer along the lines of "we know these things blow chunks, it's all due to this quadratic surface thing". This was all in the middle of John Carmack's big Direct3D rant on Usenet ("I am looking forward to doing an apples to apples comparison of Direct3D and OpenGL" turned into "I am not going to finish, and there will never be, a Direct3D port of Quake").

    It also had onboard sound that was very damn good. The MPU-401 in particular was of near Sound Canvas quality. Unfortunately it was not Sound Blaster compatible, and since the Direct3D port of Quake was never going to happen, games basically stayed in DOS and sound effects had to be sent to a separate card.

    Saturn compatibility turned into "Sega will be porting games", which of course they never bothered to do because only eight people bought the cards.

    And Diamond were just shit about the whole thing. Haven't bought a thing from them since, don't know anyone else who has either.

    So in essence: Great silicon, some serious forward steps were taken; Shitty marketing, I guess they learned.

    Dave

    BTW, while we're on the subject of bullshit graphics accelerators, did anyone ever get texture mapping going on a Millennium2?
  • by Anonymous Coward on Friday February 09, 2001 @06:31PM (#442453)
    Tiling and the quadratic stuff are completely different. The quadratic stuff is about having a curved primitive. While that's a good thing, most surfaces you'll want to draw won't be quadratic, so you're just slightly better off than if you had triangles. There were also problems in how to do the texturing (well, actually, texturing wasn't too important back then, so it didn't go into NV2). Note that the new DirectX 8 has cubic splines in it, which ought to be better than quadratic primitives.

    The modern approach to drawing these curved primitives is to tessellate them into many triangles. If you had to draw a cubic spline using some quadratic primitive, you would have to "tessellate" your cubic primitive into smaller quadratic ones.

    Tiling is where you divide your screen into tiles (say, 16x16 pixels per tile). The hardware buffers all your glTriangles (or whatever) and doesn't draw a thing until you glSwapBuffers (which would normally display the picture). At that point, it sorts all the triangles into the tiles (some triangles straddle several tiles). Then each tile is drawn one after the other (sketched below). The main advantage is that this per-tile work can all be done on-chip instead of doing it in frame buffer memory. Frame buffer bandwidth is one of the most limiting aspects of modern graphics chip design.

    The Kyro also uses a scan-line algorithm (if I understand correctly). In particular, it actually sorts all the surfaces from front to back (or maybe back to front) to get transparencies correct (or at least its Sega incarnation in the DreamCast does this). Unfortunately, this great feature runs against the 3D APIs (both DirectX and OpenGL), which don't sit easily with that kind of business. Nevertheless, I do believe that such an extension will go into the APIs in the near future; the developers are asking for it and the chip designers are trying to get it working.

    I design 3d chips for an important graphics company but because of my company's culture, I'd better stay anonymous and not even say whom I work for.
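
    A rough sketch of that scheme (hypothetical names, nobody's actual hardware): buffer the frame's triangles, bin them into screen tiles at swap time, then rasterize one tile at a time so the depth and colour traffic for that tile can stay on-chip:

    #include <stdlib.h>

    #define TILE_SIZE 16
    #define TILES_X   (640 / TILE_SIZE)
    #define TILES_Y   (480 / TILE_SIZE)

    typedef struct { float x[3], y[3], z[3]; } Tri;

    typedef struct {
        const Tri **tris;            /* triangles touching this tile */
        size_t      count, cap;
    } TileBin;

    static void bin_push(TileBin *b, const Tri *t)
    {
        if (b->count == b->cap) {
            b->cap  = b->cap ? b->cap * 2 : 16;
            b->tris = realloc(b->tris, b->cap * sizeof *b->tris);
        }
        b->tris[b->count++] = t;
    }

    /* Called at "swap buffers" time, once the whole frame's triangles are in. */
    void render_frame(const Tri *tris, size_t num_tris,
                      TileBin bins[TILES_Y][TILES_X])
    {
        size_t i;
        int tx, ty, k;

        /* 1. Sort every triangle into the tiles its bounding box touches
         *    (a triangle straddling tiles lands in several bins). */
        for (i = 0; i < num_tris; ++i) {
            const Tri *t = &tris[i];
            float minx = t->x[0], maxx = t->x[0];
            float miny = t->y[0], maxy = t->y[0];
            for (k = 1; k < 3; ++k) {
                if (t->x[k] < minx) minx = t->x[k];
                if (t->x[k] > maxx) maxx = t->x[k];
                if (t->y[k] < miny) miny = t->y[k];
                if (t->y[k] > maxy) maxy = t->y[k];
            }
            for (ty = (int)(miny / TILE_SIZE); ty <= (int)(maxy / TILE_SIZE); ++ty)
                for (tx = (int)(minx / TILE_SIZE); tx <= (int)(maxx / TILE_SIZE); ++tx)
                    if (tx >= 0 && tx < TILES_X && ty >= 0 && ty < TILES_Y)
                        bin_push(&bins[ty][tx], t);
        }

        /* 2. Rasterize tile by tile; in real hardware the per-tile depth and
         *    colour buffers would live in fast on-chip memory. */
        for (ty = 0; ty < TILES_Y; ++ty)
            for (tx = 0; tx < TILES_X; ++tx) {
                /* rasterize_tile(tx, ty, bins[ty][tx].tris, bins[ty][tx].count); */
                bins[ty][tx].count = 0;      /* reset the bin for the next frame */
            }
    }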
