Graphics Software

nVidia NV3x Sneak Peek

zoobaby writes "Here is a sneak peek at nVidia's upcoming line of cards. No hard specs, but some nice notes on the changes from the current NV2x to the NV3x, plus some very nice screenshots to show off what it will be capable of." In related news, Tim_F noticed that memory manufacturer Crucial is entering the video card business with their first card based on the ATI Radeon 8500LE.
  • by Anonymous Coward
    Most of us aren't Slashdot admins, after all.
    • by Anonymous Coward
      I was surprised when I clicked the link and got this:

      Windows cannot determine the validity of this certificate because it cannot locate a valid certificate revocation list from one or more of the certification authorities in the certification path.

      Also didn't know that ATI was owned by slashdot now. Hmmm.
  • by Smelly Jeffrey ( 583520 ) on Monday July 29, 2002 @10:45PM (#3976142) Homepage
    Wow, that's a nice picture of a motorcycle there. It's so well rendered that you can actually see that the designer forgot to render valve caps on the tire valves. Damn!
  • ...are awesome. I haven't upgraded to a GF4 because of a Promise ATA/100 BIOS conflict with the chipset. Maybe it's time to hope for some real competition in the graphics market :)
    • ...mean absolutely nothing, as ATi doesn't even have their own driver set out for Linux.

      I've heard some good things about DRI, but nobody using ATi hardware that I know has been able to tell me with a straight face that their card performs as well in Linux as it does in Windows like nVidia cards do.
  • by jerkychew ( 80913 )
    The word 'noticed' links to an admin page, and the 'ATI' link sends me back to the home page. Please fix.
  • by Anonymous Coward
    Why is that link to the crucial memory story pointing to an HTTPS address on slashdot?

    And why is the cert. authority "Snakeoil"? Is this some sort of joke from the /. crew?
  • Voice of Reason (Score:2, Insightful)

    by Anonymous Coward
    I know, I know. You all can't wait to get your hands on the only component that really pushes computer hardware. However, I feel I have to remind you that these are only technical demos, and they don't show, among other things, the true performance of the card. What's the point of being able to render objects with that level of realism if you're only getting half a frame per second to pull it off?
    • Excellent point. My early days with POV-RAY yielded some pretty comparable, if not better, renderings. I always thought it would be cool to render POV code in real time, but it's just too CPU intensive. It's nice to see that 3D specs are finally starting to catch up with ray-trace quality.
  • Eye Candy (Score:3, Interesting)

    by galaga79 ( 307346 ) on Monday July 29, 2002 @10:51PM (#3976166) Homepage
    The eye candy is pretty damn amazing, especially that rendering taken from Final Fantasy: The Spirits Within movie. Just a couple of questions, though: are those sorts of graphics possible on existing cards, with game developers just not fully taking advantage of the shaders? And is this card intended for consumers/gamers?
    • I believe much of the quality comes from 64/128-bit floating point precision. I gathered as much from the ATI 9700 launch video, which was also quite impressive.
    • They took the pictures out - my god, first they want all the publicity they can get, then they get cold feet when it comes.
    • Re:Eye Candy (Score:3, Informative)

      by Graymalkin ( 13732 )
      Up until the GF3 you couldn't find a consumer card with programmable pixel shaders. Without precise programmable shading done in hardware, a good deal of effects just aren't possible. Current GF cards only support 64-bit integer lighting calculations, which work fine if you want simple lighting, but for any real realism, high-precision floating point shader math is required. DirectX 9 and OpenGL 2 are both going to require floating point lighting calculations, and thus hardware will need to support it, as the R300 and NV30 do.

      If you used every feature of the GF2 or 3 you could get some really nice looking graphics. Whether you would get them running fast enough to play a deathmatch style game is the important question though. Developers can't just make a game for the GF4 and say everyone else can upgrade or else. Even the folks at id develop with hardware in mind that ought to be mainstream when their products are released. Quake 3 ran fine on the TNT2 and the original GeForce 256. Doom 3 is designed around the GF2/3 line of cards and their features.
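      To make the precision point above concrete, here's a rough sketch in plain C (not shader or driver code, and the shininess value is just an illustrative choice): if the N.H dot product is stored at 8-bit precision before the specular exponent is applied, the highlight bands badly, while keeping it in floating point until the final framebuffer write does not.

      #include <stdio.h>
      #include <math.h>

      /* round a value in [0,1] to the nearest 1/255 step */
      static float q8(float x)
      {
          if (x < 0.0f) x = 0.0f;
          if (x > 1.0f) x = 1.0f;
          return floorf(x * 255.0f + 0.5f) / 255.0f;
      }

      int main(void)
      {
          const float shininess = 32.0f;   /* illustrative exponent */
          int shades_float = 0, shades_int = 0;
          float prev_f = -1.0f, prev_i = -1.0f;

          /* sweep across the bright part of a specular highlight */
          for (int step = 0; step <= 2000; step++) {
              float ndoth = 0.95f + 0.05f * (float)step / 2000.0f;

              /* float intermediates, quantize only the final value */
              float f = q8(powf(ndoth, shininess));

              /* 8-bit intermediate: N.H quantized before the exponent */
              float i = q8(powf(q8(ndoth), shininess));

              if (f != prev_f) { shades_float++; prev_f = f; }
              if (i != prev_i) { shades_int++;   prev_i = i; }
          }
          printf("distinct shades, float N.H : %d\n", shades_float);
          printf("distinct shades, 8-bit N.H : %d\n", shades_int);
          return 0;
      }

      The second count comes out at around a dozen shades across the whole highlight, which is exactly the kind of banding a floating point pipeline avoids.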
      • Put another way, your 486 could have rendered a shot from Final Fantasy: TSW. The question is, how long would it take?

        • I don't see the correlation. The DX9-compliant hardware needs to support pretty huge shader programs that can be executed in a single pass. This keeps up the fillrate at the price of fancy effects. You could easily render parts of FF:TSW in realtime on current video cards. The NV30 and R300 are capable of doing Renderman-quality shaders in a single pass, which is PDQ compared to general purpose processors chugging away at them. Programmable pixel shaders make the big difference with 3D graphics, as they can pull some realistic looking shots out of an otherwise lifeless image. Compare the Quake 1 engine to the Doom 3 engine. Doom 3 is using current cards to the extent of their abilities and spitting out graphics that look like CG shots from feature films. The in-game shots from D3 look like the cutscenes from FF7.
    • The eye candy is pretty damn amazing, especially that rendering taken from Final Fantasy: The Spirits Within movie

      Hey, I can also render such scenes. Without a TNT2. The thing is that rendered scenes don't prove much. Not even if they're from Final Fantasy. Only thing they prove is:

      a) The CPU has worked on the scene for a while.
      b) The CPU worked a bit less since the gfx card did some of the work.

      Those scenes aren't proving much. Especially not the Final Fantasy one since it wasn't even made with an NV3x (!)
  • by faeryman ( 191366 ) on Monday July 29, 2002 @10:52PM (#3976168) Homepage
    Did anyone else's brain want to instinctively drink that cup of coffee on the 3rd page?
  • Before getting hyped too much, you should really read Carmack's comment. Any game that has true support of "CineFX" will need to be run on at least a third-generation NV30.

    As much as I think NVIDIA is an honest, good company, I'll hold onto my GF4 Ti4400 until "NV30" 2.

    I'm no longer eating up tech demos. Even if they're damn impressive, they're far too future-oriented to seriously invest in.

  • "This is intended to allow real-time cinematic effects in real-time"

    Today is a great day in computing history: nVidia is the first to bring us "real-time cinematic effects" that actually occur in real time! I can't wait!
  • If you look at the features that are implemented in the NV30 chipset, it is just about what the features for DX9 are. As far as its capabilities are concerned, I really am excited to see that such technology will be used in games, but I am definitely not buying that card; its price is well out of my range, and I really do not like how nVidia's implementation of their Linux drivers works. It seemed that after their 2802 drivers, their OpenGL didn't work too well at all with Red Hat Linux, and when I switched to Gentoo recently, it didn't work at all. So, as a result, I threw a Matrox G400 into my box, and will await the first company that creates Linux drivers that fricking work. I don't care who it is, ATI or Matrox, or some other bastards, just get it done. Because of the way XFree86 4.2 works, it is a real bitch for me to get my NV card working, and since it is a GeForce2 MX, I really don't care if it does or not. I have a card that works with the games that I play on the PC, and for Doom, or any other high-end 3D game, I have an Xbox. So I can wait at least until someone gets decent drivers again.
    • You want a good driver from a vendor? Don't hold your breath. At least nVidia provides Linux drivers. That's the only reason I pick them over anyone else (that, and the fact that, so long as Lord Carmack's backing them, you know the important games'll always have a bias in favor of nVidia).

      What we need is a company that releases some open source drivers. Good luck on THAT happening.
  • It will be interesting to see how much of a foothold ATI's Radeon 9700 can get before nVidia's new card actually hits shelves. As this article [tomshardware.com] points out, nVidia has trumped ATI's latest graphics card almost immediately after it became available, with an even more powerful one of their own. Do people think nVidia will see fewer pre-orders this time, compared to those made before the releases of their past cards?
  • screenshots? (Score:3, Informative)

    by ywwg ( 20925 ) on Monday July 29, 2002 @10:57PM (#3976185) Homepage
    third page? screenshots? where are people looking?
  • Seems like the people over at sharkyextreme.com didn't think that the slashdot crowd were worthy of them lovely pics. Anyone happen to have a mirror?
  • Screenshots (Score:3, Insightful)

    by ikekrull ( 59661 ) on Monday July 29, 2002 @11:01PM (#3976200) Homepage
    I would be rather surprised if those screenshots represented actual, realtime-rendered scenes.

    They look like they have been lifted directly off the ExLuna BMRT (kudos to Larry Gritz for a great renderer) gallery page.

    It may be that these are NV30 realtime scenes, with the BMRT Renderman shaders used in the BMRT renders ported to Cg, but it is also possible they are simply the BMRT-rendered examples, given to show what is possible using a shader-based rendering architecture.

    Anybody have any more info on whether these examples are actual realtime DirectX/OpenGL scenes?

    -Pete
    • by dmiller ( 581 )
      If you like BMRT, you may also like Aqsis [aqsis.com]. It is a GPL, Renderman compatible renderer.
    • In fact, the screenshots look indistinguishable from those listed on the BMRT page [exluna.com].
    • They look like they have been lifted directly off the ExLuna BMRT (kudos to Larry Gritz for a great renderer) gallery page.

      I wonder if that might be because nVidia recently bought [com.com] ExLuna...

  • http://www.crucial.com/store/listmfgr.asp?cat=Video+Card
  • by H3XA ( 590662 ) on Monday July 29, 2002 @11:01PM (#3976204)
    HardOCP - Crucial Response [hardocp.com]

    Since the R9000 has already been launched and is supposed to take the place of the 8500/LE, how long will Crucial produce this card?
    The length of time we'll sell this and any product is dependent on the market. Right now, the Crucial Radeon 8500LE is an excellent and economical option for anyone looking to improve their graphics capability.

    Is the Crucial VidCard made in the USA?
    The Micron DDR memory used in our Crucial Radeon 8500LE video card is manufactured in the USA. But the video card itself is assembled in Hong Kong.
    Astute [H]'er, Robin Schwartz, pointed out that the Crucial driver downloads page points to Sapphire Tech in Hong Kong, apparently the folks building the card.

    How much will it retail for?
    Currently, the Crucial Radeon 8500LE is available for $134.99 through Crucial.com and it comes with free shipping in the contiguous US.

    Will the 9000 chipset follow closely?
    We'll consider offering other video card options in the future. Whether we do depends on what our customers want and need.

    Where will it sell through?
    As with all our products, any new Crucial video cards would be available direct through our Web site at Crucial.com. We would also expect to offer new products through our European Web site at Crucial.com/UK. In fact, the Crucial Radeon 8500LE should be available through the UK site shortly.

    - HeXa
  • I was unable to find the screenshots on the review page.

    Does anybody know where they are? (Again, really sorry for asking this, but you know the story is already sec^H^H^Hbroken.)

  • I don't see any, just some lame Doom III logo.
  • If you consider a comparison table and a Doom 3 logo to be screenshots, then well...
  • by Christopher Thomas ( 11717 ) on Monday July 29, 2002 @11:14PM (#3976257)
    Ace's Hardware [aceshardware.com] also has a short but very informative article about the NV30.
  • PAGE 3 MIRROR! (Score:3, Informative)

    by Anonymous Coward on Monday July 29, 2002 @11:20PM (#3976273)
    here [webbschool.com].
  • That's a kick @@s doom III screen shot!
  • by Quarters ( 18322 ) on Monday July 29, 2002 @11:22PM (#3976281)
    "My current work on Doom is designed around what was made possible on the original GeForce, and reaches an optimal implementation on the NV30. My next generation of work will be designed around what is made possible on the NV30."
    The GF cards came out in 1999 or so (give or take). No matter how fast and furiously the hardware manufacturers pump out new silicon, there is always a long adoption cycle for any new concepts. Game developers would be pretty thick-headed to close out an installed base of X just to support a feature on Y cards (where Y is an extremely small number compared to X).

    It doesn't matter how earth-shattering the NV30 will be. Its complete feature set won't be utilized anytime soon. The GF3/4 cards still have long lives ahead of them.
  • That's the killer. nVidia got to a programmable pipeline just before ATI and now they've gotten to 0.13 micron just before ATI. ATI is a half-step ahead of nVidia at this point. But as the article implies, nVidia has paid the overhead cost of improving their production. And that puts them in better shape for the future.

    Personally, buying an ATI is not even debatable until they put out Linux drivers. We'll see if the rumoured move to a unified driver architecture is true. So by my scorecard, ATI takes this round 10-9, but nVidia still leads by two rounds. (Judging by a 10-point must system, no standing 8-count, and the fighter can't be saved by the bell in any round.)

  • by Viking Coder ( 102287 ) on Monday July 29, 2002 @11:54PM (#3976369)
    Given this [webbschool.com] "nVidia rendered image" and this [exluna.com] BMRT rendered image, I see three possibilities.

    One - the guys at nVidia painstakingly translated each aspect of the original image to Cg.

    Two - the guys at nVidia have some technology that translates RenderMan to something they know how to render. It could be RenderMonkey-like technology. It could literally be RenderMonkey, with some nVidia back-end. It could be they contacted the original artist, John Monos, and took his original data and reformatted it (skipping RenderMan, entirely).

    Three - the images are a forgery.

    I'm betting on Three.
    • by Viking Coder ( 102287 ) on Tuesday July 30, 2002 @12:05AM (#3976396)
      Sorry, here's ALL the evidence that I found :

      BMRT chess [exluna.com] (by John Monos) vs. "nVidia chess" [webbschool.com]

      BMRT Bike [exluna.com] (by Don Kim) vs. "nVidia Bike" [webbschool.com]

      BMRT Table [exluna.com] (by Goran Kocov) vs. "nVidia Table" [webbschool.com]

      BMRT Markers [exluna.com] (by Rudy Poat) vs. "nVidia Markers" [webbschool.com]

      I believe I've pretty definitively shown that either they have an actual RenderMan renderer running on their hardware (and access to the original data by four different authors), or this is a fake.

      Sorry, I can't find the coffee cup or the Final Fantasy image. Maybe someone else can.
      • Found the coffee cup.

        BMRT coffee [exluna.com] (by Horvatth Szabolcs) vs. "nVidia coffee" [webbschool.com]

        Given the news that nVidia bought Exluna [com.com], I suppose it IS possible that they rendered from original data. Hmph.

        I'd appreciate it if they fessed up and reported that they make a RenderMan renderer. I actually think that's bigger news than their exact hardware specs. It means "One interface to rule them all..."
        • by donglekey ( 124433 ) on Tuesday July 30, 2002 @12:52AM (#3976484) Homepage
          It is a forgery. The BMRT images are so old that there is little chance they could dig up five- or six-year-old images from people who have no association with Exluna and who created them before there was an Exluna.

          More than that, the coffee cup was rendered with Entropy, not BMRT. Like all of those images, it was done by someone else, this one recently for an image contest.

          The most obvious flaw though, is that those images are raytraced, and this is not something that anyone is claiming to do in realtime yet. It is beyond the scope of Nvidia's processor, as it should be. Those images are scaled duplicates that aren't changed a bit, and there is no way that an Nvidia card rendered them, because there is no way the reflections would be the same, but they are. Reflection maps have a tendency to look correct, but not the same. There is also depth of field which is not impossible, but is improbable for now.
          • There is nothing about current shaders that restricts them to raster graphics only. Current shaders can act like a ray tracer if you create the mesh such that there is one (or more) vertex per pixel. If you really wanted to, you could make a primitive raytracer using current video card shaders.

            The most obvious flaw though, is that those images are raytraced, and this is not something that anyone is claiming to do in realtime yet

            I didn't read anywhere that those images were rendered in realtime. Either way, I wouldn't be surprised if those claims were made.

            Those images are scaled duplicates that aren't changed a bit, and there is no way that an Nvidia card rendered them, because there is no way the reflections would be the same, but they are.

            If the nv3x uses full precision floating point operations, then there is no reason why a reflection map would look different. The algorithm is very standard. Env mapping looks horrible in some games because the hardware uses 'good enough' calculations with very tiny env maps and poor surface detail.

            There is also depth of field which is not impossible, but is improbable for now.

            Realtime DOF for games may be improbable, but not for rendering a single image. You can simulate DOF using multipass rendering as explained in the OpenGL redbook.
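            For what it's worth, here is a rough sketch of that multipass idea in classic OpenGL with the accumulation buffer. It is not the exact red book listing (the red book jitters the projection frustum; this simplified variant just jitters the eye point around a small lens aperture while aiming at a fixed focal point), and drawScene() is a hypothetical placeholder for whatever renders the frame.

            #include <math.h>
            #include <GL/gl.h>
            #include <GL/glu.h>

            extern void drawScene(void);  /* assumed: draws the scene geometry */

            void renderWithDepthOfField(int passes, float aperture,
                                        float eyeX, float eyeY, float eyeZ,
                                        float focusX, float focusY, float focusZ)
            {
                glClear(GL_ACCUM_BUFFER_BIT);

                for (int i = 0; i < passes; i++) {
                    /* spread the sample points over a small disc: the "lens" */
                    float a  = 2.0f * 3.14159265f * (float)i / (float)passes;
                    float dx = aperture * (float)cos(a);
                    float dy = aperture * (float)sin(a);

                    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
                    glMatrixMode(GL_MODELVIEW);
                    glLoadIdentity();
                    /* every pass looks at the same focal point, so geometry at
                       the focal distance stays sharp and everything else blurs */
                    gluLookAt(eyeX + dx, eyeY + dy, eyeZ,
                              focusX, focusY, focusZ,
                              0.0, 1.0, 0.0);
                    drawScene();

                    /* average this pass into the accumulation buffer */
                    glAccum(GL_ACCUM, 1.0f / (float)passes);
                }
                glAccum(GL_RETURN, 1.0f);
            }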
          • those images are raytraced, and this is not something that anyone is claiming to do in realtime yet.

            I beg to differ.

            I would have claimed the same before last week's Siggraph conference. But at that conference, I went to a panel discussion entitled something like "When will ray tracing replace rasterization?" The answer was "we'll do a hybrid approach instead". The first presenter showed an app (which was also running at RackSavers on the show floor) that was actually doing real time raytracing. It was rendering a conference room scene. You could dynamically change the viewpoint anywhere you like, move the furniture around, and it would even recompute a diffuse reflection solution progressively. Very impressive! He also showed another app that rendered the reflections of a car headlight at something like 5 fps.

            I would also suggest that you check out the paper that someone pointed out from Stanford. They have written a raytracer that uses the pixel shader of the nVidia hardware to render triangle-based scenes at interactive rates. Very impressive.

            I wouldn't discount those images as forgeries quite yet. With the new pixel shaders and vertex programs, the GPUs are rapidly becoming very versatile stream processors.
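            To be clear about what gets mapped onto the pixel shaders in that kind of work, the per-ray math is nothing exotic. This is not code from the Stanford paper, just a plain-C sketch of the ray-sphere intersection a shader-based raytracer would evaluate once per pixel:

            #include <stdio.h>
            #include <math.h>

            typedef struct { double x, y, z; } vec3;

            static double dot(vec3 a, vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
            static vec3 sub(vec3 a, vec3 b) { vec3 r = { a.x-b.x, a.y-b.y, a.z-b.z }; return r; }

            /* distance along the ray to the nearest hit, or -1.0 on a miss;
               dir is assumed to be normalized */
            static double ray_sphere(vec3 origin, vec3 dir, vec3 center, double radius)
            {
                vec3 oc = sub(origin, center);
                double b = 2.0 * dot(oc, dir);
                double c = dot(oc, oc) - radius * radius;
                double disc = b * b - 4.0 * c;
                if (disc < 0.0) return -1.0;
                double t = (-b - sqrt(disc)) / 2.0;
                return (t > 0.0) ? t : -1.0;
            }

            int main(void)
            {
                vec3 eye = { 0, 0, 0 }, dir = { 0, 0, -1 };  /* looking down -Z */
                vec3 center = { 0, 0, -5 };
                printf("hit at t = %.2f\n", ray_sphere(eye, dir, center, 1.0)); /* 4.00 */
                return 0;
            }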
    • Four - nVidia acquires [com.com] exluna and uses a modified version of BMRT/Entropy to shade using nVidia hardware. They render images from previous image contests where the artists gave up ownership and had to submit the source files to prove the images were done with their software.

      I'm sure nVidia and exluna have been working together on this for a while.

      The guys at nVidia have some technology that translates RenderMan to something they know how to render.

      Yes, probably a compiler. RenderMan is just a language and can be implemented on whatever you wish.
    • It would be easier to tell if you ACTUALLY READ the caption rather than just look at the pictures.
      This image montage should be instantly recognizable, and is from the Final Fantasy: The Spirits Within movie. The NV3x architecture's CineFX engine is capable of rendering scenes of similar accuracy, as those found in computer generated films such as Final Fantasy.

      It says capable of rendering, see. That means the chip has the same rendering capability in its vertex shader as the high-powered rendering engine that rendered these original pictures. It does not say they actually rendered this picture on this chip.

      Get it now....?

      • I think deception was the goal.

        It would be better if you added <sarcasm> tags to your message. =)
        • If deception is the problem, then accuse them of deception. Forgery has a more clear definition, and if these pictures have enough disclaimer, then they are not forgeries.

          The coffee cup, the knight, and the girl all have adequate disclaimers. The motorcycle and tabletop both have language in them implying that they were rendered with Cg. Make your case there.

          Last time I used sarcasm tags, I got a complaint that I didn't really need them. It was "obvious". I guess you can't please everyone. Style is a personal choice...

          • "Enough disclaimer." Interesting concept.

            I think it's a reasonable expectation that rendered images in the context of a new piece of graphics hardware were actually produced using that new graphics hardware.

            I'm making the assertion that these specific images were instead produced by BMRT, and I'm even citing original images which can be compared with the images supposedly produced on the NV30. I think I've clearly demonstrated that there's a striking resemblance. It would have been far, far better if the article cited the source of the models, and further went on to detail the process by which the original scene data was instead rendered on the NV30. Not doing so leaves me with no choice other than to assume that the images were falsely produced (i.e., forgeries).

            The responsibility to accurately and without question disclaim the images as NOT being actual new renderings produced by the hardware lies with the vendor (or the person presenting them). Since that was not uniformly done, I am accusing them of forgery.
            • I think it's a reasonable expectation that rendered images in the context of a new piece of graphics hardware were actually produced using that new graphics hardware.

              images supposedly produced on the NV30

              ...except that they did not ever claim that they produced them. You inferred that from their placement on the page. You assumed that if a picture is on the page, it must be from their hardware. "Not doing so leaves me with no choice other than to assume". Your mistake, not theirs. You do have the choice, you just need to use it. You can blindly expect the world to play by your rules, or you can learn how it works and understand how to play by its rules.

              It sounds like the difference is between what YOU expect from a disclaimer and what a lawyer or court would consider a sufficient disclaimer.
              You seem to want some glaring statement like "Our hardware can draw pictures like this one, but we did not actually draw these exact pictures. These came from another fancy rendering program".

              A lawyer might simply say "Our hardware can draw pictures like this one." "Like" is a sufficient disclaimer.

              Of course it is meant as a deception. Unfortunately, deception is part of sales. If you really can't tell what it means, if you really can't see through the language, then you're gonna have a hard time out in the real world. You can complain all you want as to whether it's fair or right or whatever. The only rules that sales plays by are advertising laws (and sometimes not even those). The only way to call them on it is to take them to court. I don't think there's a case here.

              • Look - I'm pointing out that these images were not produced by their hardware. That's NEWS to most of the people in this forum. Read the rest of the posts.

                "Of course it is meant as a deception." As a consumer, I like not to be deceived. Being deceived enrages me. Especially when there are supposed to be layers of "fact checkers" between me and the news that I read. Those fact checkers have failed, and I am doing what I can to educate the public about this FRAUD.

                "Like" is not a sufficient disclaimer, in the opinion of most engineers.

                I'm pointing out the difference between what is implicitly claimed and what is technological truth. If you're going to get pissed at me for that, then you have to understand that out in the real world, there are engineers who debunk myths and falsehoods. I'm doing my job, and I'm trying to educate the /. readership. On whose behalf are you acting?

                "The only way to call them on it is to take them to court."

                Or to humiliate them in a public forum, such as this one. The law is not your only defense - it's the last defense. Public ridicule can be an amazingly valuable tool.
  • Those screen shots do look great, but they are absolutely worthless if you want to evaluate the card's performance in games, or animation, or whatever.

    Every halfway decent raytracing package can produce images of the same consummate quality (using only the cpu) at, say, one frame per minute. nVidia has yet to produce some proof that their new chip can even do that.

    Remember, all the renderings are with almost 100% certainty taken from a static model, i.e. no animation, no time spent on matrix transformations. Now, what's the likelihood that the NV3x can actually render 25 of those in one second? Comparing '99's sneak peek screenshots with today's (or yesteryear's) games: very low.

    Hopefully, nVidia will provide a video clip of their creation in action sometime soon.

  • Nice link (Score:3, Interesting)

    by bogie ( 31020 ) on Tuesday July 30, 2002 @12:15AM (#3976426) Journal
    An https connection and a certificate which says:
    Issued by Snake Oil CA

    Issuer:
    E = ca@snakeoil.dom
    CN = Snake Oil CA
    OU = Certificate Authority
    O = Snake Oil, Ltd
    L = Snake Town
    S = Snake Desert
    C = XY

    Subject:

    E = brian@tangent.org
    CN = .slashdot.org
    OU = Slashdot
    O = Slashdot
    L = Nowhere
    S = Denial
    C = US

    Umm, yea sure I'll trust that.
    • Two things... (Score:2, Informative)

      by srvivn21 ( 410280 )
      1. It shows that you have never created an SSL certificate with OpenSSL. The snakeoil stuff is the defaults. No big deal.

      2. Trust the certificate. Just don't send any information you want kept secret. It's just encrypting the request/reply, not installing anything on your computer.
      • Thanks for responding, I figured it might be some sort of default, but Snake Oil is not a very endearing thing to call a cert, that's what had me wondering. :-)
  • the flip side (Score:3, Insightful)

    by klocwerk ( 48514 ) on Tuesday July 30, 2002 @12:26AM (#3976447) Homepage
    Yes, the screen shots are gorgeous, but it's all just eye candy.
    Once we have hardware that can render realistic scenes and humans in real time, there's going to be a sudden realization that for all this prettiness, there's nothing behind it.
    imho, it's time we started really looking at interactive and reactive programming. Yes, AI research is a step in the right direction, as is the real-time English parsing stuff, but we need systems that can at least pretend to comprehend and react to real-time and infinitely variable human input.
    Imagine King's Quest with those graphics, where you can type in whatever you want (short of l33t sp34k), the game understands it, and it reacts accordingly.

    Graphics are pretty, but with nothing behind them the graphics are just empty shells.
    • Yup, especially rendered *still* scenes are just "empty shells". I'm still wondering what they want to prove by showing off those scenes. I'm sure it could have been done before without overly expensive cards. Don't you just let the renderer sit there and do its work? Perhaps the point is that the NV3x does the work *faster*, but a picture is pretty bad at illustrating speed. :-P

      Besides, that stuff with the Final Fantasy scene and something like "... and you can make scenes like this" just sounds silly for pretty obvious reasons.

      Nah - show us some *in-game* scenes and/or real-time graphics to drool at instead of rendered spoons.
      imho, it's time we started really looking at interactive and reactive programming. Yes, AI research is a step in the right direction, as is the real-time English parsing stuff, but we need systems that can at least pretend to comprehend and react to real-time and infinitely variable human input.

      And after that we'll work on turning lead into gold.

      Beyond very simple and well-defined contexts, natural speech parsing seems to require a solution to the Strong AI problem. Ditto a computer that responds intelligently to you outside a simple and well-defined context.

      I'll happily settle for better game AI, as that's about all we'll be getting for the next 30-50 years or so.
  • "This is intended to allow real-time cinematic effects in real-time"...and after the show, the nVidia staff went out for pints of beer filled with beer.
    <P>Is this the kind of writing we get when buzz-words collide?
  • by DarkHelmet ( 120004 ) <<ten.elcychtneves> <ta> <kram>> on Tuesday July 30, 2002 @12:59AM (#3976492) Homepage
    There is another sneak peek at firingsquad...

    http://firingsquad.gamers.com/hardware/cinefx/default.asp [gamers.com]

    Joy.

    • Sharky claims that 64- and 128-bit color depths will result in more color vibrancy. They won't. Pure red is always the same pure red, regardless of whether it's represented as (0xf,0,0) or (0xffff,0,0).

      What it will result in is fewer banding problems, particularly in areas where there's little color variation over a large area, such as fog. Such artifacts are more obvious in moving pictures such as movies or real-time 3D than they are in static images.
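      To put rough numbers on the banding point (plain C, not tied to any particular card; the 5% brightness slice is just an example): count how many representable shades fall inside a narrow range like a patch of fog.

      #include <stdio.h>

      int main(void)
      {
          /* fog often spans only a thin slice of the brightness range */
          const double lo = 0.70, hi = 0.75;

          int steps8  = (int)((hi - lo) * 255.0);    /* 8 bits per channel  */
          int steps16 = (int)((hi - lo) * 65535.0);  /* 16 bits per channel */

          printf("shades available across the fog gradient:\n");
          printf("  8-bit channel : about %d (visible bands)\n", steps8);
          printf("  16-bit channel: about %d (smooth to the eye)\n", steps16);
          return 0;
      }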

      • Red's a poor example because reds suck. The (OK, one) reason high-end video monitors cost so much is that their phosphors adhere to the standards. TVs and computer monitors have a certain tendency to use cheap phosphors, and reds suffer the most.

        What I personally would like to see is a change from RGB to something with a bigger color gamut. Something that's not *smaller* than what the human eye can see. We've got monitors with pixels small enough that our eyes can't tell the difference between one and two. We've got framerates so high we can't tell the difference between half and full framerate. But every color in every pixel of every frame is substandard. Entire classes of colors (royal purples, for example) cannot be expressed in RGB space. If you don't believe me, take a graphics class. You'll learn how incredible video cards really are and how substandard the colors they're working with really are.

        Actually, I just want FP colors so when I find a dark corner in an FPS and try to snipe at people, I don't get rocket-blasted by some asshole with the gamma jacked up sky-high who sees me a mile away.
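        For anyone who hasn't played with it, the gamma slider is just a power curve applied to the normalized pixel value, which is why cranking it up defeats dark corners. A tiny illustration in plain C (the 0.05 and 2.5 figures are made-up examples, not measurements):

        #include <stdio.h>
        #include <math.h>

        /* standard gamma adjustment on a normalized 0..1 value */
        static double apply_gamma(double in, double gamma)
        {
            return pow(in, 1.0 / gamma);
        }

        int main(void)
        {
            double dark_corner = 0.05;  /* nearly black pixel */
            printf("gamma 1.0 (as authored): %.2f\n", apply_gamma(dark_corner, 1.0));
            printf("gamma 2.5 (jacked up)  : %.2f\n", apply_gamma(dark_corner, 2.5));
            /* 0.05 -> about 0.30: the "hidden" sniper is plainly visible */
            return 0;
        }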
  • This might be just me, but will the memory interface of the NV30 be 64-bit or 128-bit? A quad-pumped 64-bit interface isn't all that much different from a dual-pumped 128-bit interface.

    If the NV30's memory interface is only 64-bit, the main reason to wait for the card is its die shrink, and DDR-II is a non-issue.
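    Back-of-the-envelope, the two layouts really do land in the same place: peak bandwidth is just bus width in bytes times base clock times transfers per clock. A small C illustration, with an assumed 250 MHz base clock purely for the sake of the example:

    #include <stdio.h>

    /* peak bandwidth in GB/s = (bus width / 8 bytes) * base clock * pump rate */
    static double bandwidth_gbps(int bus_bits, double base_clock_mhz, int pump)
    {
        return (bus_bits / 8.0) * base_clock_mhz * 1e6 * pump / 1e9;
    }

    int main(void)
    {
        const double clk = 250.0;  /* assumed base clock in MHz */
        printf("64-bit, quad-pumped : %.1f GB/s\n", bandwidth_gbps(64,  clk, 4));
        printf("128-bit, dual-pumped: %.1f GB/s\n", bandwidth_gbps(128, clk, 2));
        return 0;
    }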

  • ...since I'm not sure what I'd want to do with a mountaintop anyway.

    /. readers: we're techno-chic and brilliant, we can install the latest distro of Linux on a TI-99 pocket calculator, and we can wax eloquent on the DMCA and its ramifications for free speech; however, we simply cannot spell better than a third-grader.

  • Man, I saw this story right when it was posted last night, and the first thing I said was "Smells Like FUD". I didn't post it then because I was sure someone else would, but anyway... nVidia just got their ass kicked by ATI, so they release some "really new screenshots" that are supposed to make everyone amazed at what their new board is supposed to be able to do. No word on how these were rendered, what's going on, or why they're releasing screenshots with no comparison against another board. The subtle message that everyone on here has missed is: don't buy ATI, wait for nVidia.
  • I am thinking of getting a dual AMD box. I want a decent video card that I can play games on and that gives me nice 2D images. Something under $200. I am leaning towards nVidia, as they are 'Linux friendly'. What cards are /.ers using?

    thanks
  • ...because there are no NV30s yet. It was confirmed in an NVDA conference call yesterday that NV30 has not taped out yet.

    All of these previews are just PR leaks to distract from the Radeon 9700 launch. Assuming NV30 tapes out today, Nvidia will be very very hard pressed to get a card in stores by Christmas Day. They have already missed the Xmas season.

    Having said that, the NV30 will be quite amazing, and (from what we know of it) should best the also-amazing Radeon 9700 by quite a bit. To be more specific: it should be better for non-realtime hardware rendering of scenes that are currently rendered in software--like those Exluna pics that were shipped out in their PR--because it has more flexible shaders (we dunno if they're faster too, but this is also likely). Yes, it will be able to render those images, in "near-realtime", though certainly not actual realtime. It should offer better texel fillrate, especially in multitexturing situations, because it has an 8x2 pipeline organization instead of 8x1 like the Radeon, and because as a .13um card it *should* achieve higher clock rates. Performance in bandwidth-bound situations depends on whether it uses 128-bit or 256-bit DDR-II; if the former, then it will have ~20% less bandwidth than the 256-bit DDR Radeon 9700, if the latter, it will have a hell of a lot more. AA/aniso performance depends both on that bandwidth question and on the particulars of the AA/aniso hardware. The Radeon's is really really good (which is why its lead over Ti4600 becomes so dramatic in AA/aniso situations), but NV30s could be that good too.

    But Nvidia is desperately late with the card, and by the time they get it out ATi may have a successor to the Radeon 9700 (perhaps .13um and DDR-II) very close at hand. This is the first time in a long time that Nvidia has screwed up big time like this (in fairness, it seems more the fault of their fab partner TSMC), and it comes just as ATi is releasing a truly excellent product. Again, NV30 will still be an awesome card when it's released, despite the delay, but leaking info about it now is just a way to string people along on a wait which will be longer than most realize.

    And these "previews" are nothing more than rehashes of Nvidia PR pdf's; they are vague not because sharkyextreme performed any difficult investigation, but because they are simply regurgitating teaser PR for a card which doesn't even exist yet.

"If there isn't a population problem, why is the government putting cancer in the cigarettes?" -- the elder Steptoe, c. 1970

Working...