nVidia NV3x Sneak Peek 202
zoobaby writes "Here is a sneak peek at nVidia's upcoming line of cards. No hard specs, but some nice notes on changes from the current NV2x to the NV3x, and some very nice screenshots to show off what it will be capable of." In related news, Tim_F noticed that memory manufacturer Crucial is entering the video card business with their first card, based on the ATI Radeon 8500LE.
You may want to fix that submit.pl link (Score:1, Insightful)
Re:You may want to fix that submit.pl link (Score:1, Informative)
Windows cannot determine the validity of this certificate because it cannot locate a valid certificate revocation list from one or more of the certification authorities in the certification path.
Also didn't know that ATI was owned by slashdot now. Hmmm.
Nice Screenshots (Score:3, Funny)
The specs on the ATI 9700 (Score:2)
Re:The specs on the ATI 9700 (Score:3, Interesting)
I've heard some good things about DRI, but nobody using ATi hardware that I know has been able to tell me with a straight face that their card performs as well in Linux as it does in Windows like nVidia cards do.
Re:The specs on the ATI 9700 (Score:2)
If someone loads their nvidia drivers in Windows, are they not still running Windows?
siri
Re:The specs on the ATI 9700 (Score:2)
Promise has a history of problems with video cards (e.g. look at some real-time video editing cards.... promise is listed as a problem child for most of them)
Re:The specs on the ATI 9700 (Score:2)
Fix the links, please (Score:2, Informative)
Re:Fix the links, please (Score:3, Funny)
Re:Fix the links, please (Score:2)
For the love of God (Score:2)
Re: That single hyperlink... (Score:2)
And here it is, still alive since 1994! [pixelscapes.com]
Re:Fix the links, please (Score:5, Informative)
Re:Fix the links, please (Score:2, Insightful)
Re:Fix the links, please (Score:2)
Why do I get the feeling that the moderation of this comment up to +4 Informative is a horrible, horrible sign that no one even tries to read the articles anymore?
Re:Fix the links, please (Score:4, Funny)
Forget to check that "post anonymously" box, huh?
Re:Fix the links, please (Score:2)
Re:Fix the links, please (Score:3, Funny)
"huh?? What, you feel like some nut once in a while? If it hurts then maybe you are a very bad person and will get to pound me in the ass."
Sicko.
Dumbass? (Score:1)
Perhaps you should take a long look in a mirror, and re-evaluate who the 'dumbass' in need of a 'clue' actually is, here.
Admittedly OT, but no better place for it. (Score:1, Offtopic)
And why is the cert. authority "Snakeoil"? Is this some sort of joke from the
Re:Admittedly OT, but no better place for it. (Score:1)
Snakeoil is the CA for the test certificate that comes with mod_ssl. The joke is from the Apache folks.
Re:Admittedly OT, but no better place for it. (Score:2)
Phillip.
Re:Admittedly OT, but no better place for it. (Score:2)
So... we don't read the articles, and Slashdot admins don't read the directions.
HARD TO BELIEVE!
Voice of Reason (Score:2, Insightful)
Re:Voice of Reason (Score:1)
Eye Candy (Score:3, Interesting)
Re:Eye Candy (Score:1)
Re:Eye Candy - slashdotted (Score:1)
Re:Eye Candy (Score:3, Informative)
If you used every feature of the GF2 or 3 you could get some really nice looking graphics. Whether you would get them running fast enough to play a deathmatch style game is the important question though. Developers can't just make a game for the GF4 and say everyone else can upgrade or else. Even the folks at id develop with hardware in mind that ought to be mainstream when their products are released. Quake 3 ran fine on the TNT2 and the original GeForce 256. Doom 3 is designed around the GF2/3 line of cards and their features.
Re:Eye Candy (Score:2)
Put another way, your 486 could have rendered a shot from Final Fantasy: TSW. The question is, how long would it take?
Re:Eye Candy (Score:2)
Re:Eye Candy (Score:2)
Hey, I can also render such scenes, without a TNT2. The thing is that rendered scenes don't prove much, not even if they're from Final Fantasy. The only things they prove are:
a) The CPU worked on the scene for a while.
b) The CPU worked a bit less, since the gfx card did some of the work.
Those scenes don't prove much, especially not the Final Fantasy one, since it wasn't even made with an NV3x (!)
pretty cool screenshots (Score:3, Insightful)
Re:pretty cool screenshots (Score:2, Insightful)
Re:pretty cool screenshots (Score:1)
Re:pretty cool screenshots (Score:1)
Re:pretty cool screenshots (Score:2, Funny)
3-way, the only way
Screenshot mirror (Score:2, Interesting)
NvNews [nvnews.net]
Re:pretty cool screenshots (Score:1)
I would mirror it but I have no webspace, sorry
Keep in mind... (Score:1)
As much as I think NVIDIA is an honest, good company, I'll hold onto my GF4 Ti4400 until "NV30" 2.
I'm no longer eating up tech demos. Even if they're damn impressive, they're far too future-oriented to seriously invest in.
Re:Keep in mind... (Score:2)
Real time effects (Score:2, Funny)
Today is a great day in computing history: nVidia is the first to bring us "real-time cinematic effects" that actually occur in real time! I can't wait!
Nice to see... (Score:1)
Basically, a directx 9 part (Score:2, Informative)
Re:Basically, a directx 9 part (Score:1)
What we need is a company that releases some open source drivers. Good luck on THAT happening.
Time vs. Radeon 9700 (Score:2, Interesting)
screenshots? (Score:3, Informative)
Re:screenshots? (Score:3, Informative)
MIRROR NEEDED! (Score:1)
Re:MIRROR NEEDED! (Score:3, Informative)
NVIDIA NV30 Sneak Preview [nvmax.com]
Some Beyond3d forum discussion as well as screenshots and more info on the NV30.
NV30 Screenshots [beyond3d.com]
One more link.... to nV News with further NV30 details
nV News [nvnews.net]
- HeXa
Re:MIRROR NEEDED! (Score:1)
Screenshots (Score:3, Insightful)
They look like they have been lifted directly off the ExLuna BMRT (kudos to Larry Gritz for a great renderer) gallery page.
It may be that these are NV30 realtime scenes, with the BMRT Renderman shaders used in the BMRT renders ported to Cg, but it is also possible they are simply the BMRT-rendered examples, given to show what is possible using a shader-based rendering architecture.
Anybody have any more info on whether these examples are actual realtime DirectX/OpenGL scenes?
-Pete
Aqsis (Score:2)
Re:Aqsis (Score:2, Interesting)
Pixar has successfully prevented ExLuna from selling their core rendering product, a cutting-edge renderer based on the renderman standard, and also made them yank BMRT, by suing them for patent infringement, and threatening them with all sorts of other nasty legal stuff. (ExLuna is/was a company run by Larry Gritz, amongst other ex-Pixarites, and Larry was the author of BMRT.)
You won't see BMRT again.
NVIDIA has since bought ExLuna for their personnel and expertise.
Yay Pixar. It's amazing how far they've taken that bogus distributed sampling patent.
A.
Re:Screenshots (Score:2)
Because nVidia bought them? (Score:2)
I wonder if that might be because nVidia recently bought [com.com] ExLuna...
Crucial's Radeon Link (Score:1)
More info on the Crucial 8500LE card (Score:3, Informative)
Since the R9000 has already been launched and is supposed to take the place of the 8500/LE, how long will Crucial produce this card?
The length of time we'll sell this and any product is dependent on the market. Right now, the Crucial Radeon 8500LE is an excellent and economical option for anyone looking to improve their graphics capability.
Is the Crucial VidCard made in the USA?
The Micron DDR memory used in our Crucial Radeon 8500LE video card is manufactured in the USA. But the video card itself is assembled in Hong Kong.
Astute [H]'er, Robin Schwartz, pointed out that the Crucial driver downloads page points to Sapphire Tech in Hong Kong, apparently the folks building the card.
How much will it retail for?
Currently, the Crucial Radeon 8500LE is available for $134.99 through Crucial.com and it comes with free shipping in the contiguous US.
Will the 9000 chipset follow closely?
We'll consider offering other video card options in the future. Whether we do depends on what our customers want and need.
Where will it sell through?
As with all our products, any new Crucial video cards would be available direct through our Web site at Crucial.com. We would also expect to offer new products through our European Web site at Crucial.com/UK. In fact, the Crucial Radeon 8500LE should be available through the UK site shortly.
- HeXa
Sorry for asking but... (Score:1)
Does anybody know where they are? (Again really sorry for asking this. But you know the story is already sec^H^H^Hbroken).
What screenshots??? (Score:2)
Screenshots ? (Score:2)
Ace's Hardware also has a preview. (Score:3, Informative)
PAGE 3 MIRROR! (Score:3, Informative)
screenshots (Score:1)
So, in 3.5 years it'll be necessary (Score:4, Interesting)
It doesn't matter how earth-shattering the NV30 will be. Its complete feature set won't be utilized anytime soon. The GF3/4 cards still have long lives ahead of them.
0.13 Micron (Score:1)
Personally, buying an ATI is not even debatable until they put out Linux drivers. We'll see if the rumoured move to a unified driver architecture is true. So by my scorecard, ATI takes this round 10-9, but nVidia still leads by two rounds. (Judging by a 10-point must system, no standing 8-count, and the fighter can't be saved by the bell in any round.)
Thinking it's a forgery (Score:4, Interesting)
One - the guys at nVidia painstakingly translated each aspect of the original image to Cg.
Two - the guys at nVidia have some technology that translates RenderMan to something they know how to render. It could be RenderMonkey-like technology. It could literally be RenderMonkey, with some nVidia back-end. It could be they contacted the original artist, John Monos, and took his original data and reformatted it (skipping RenderMan, entirely).
Three - the images are a forgery.
I'm betting on Three.
Re:Thinking it's a forgery (Score:4, Informative)
BMRT chess [exluna.com] (by John Monos) vs. "nVidia chess" [webbschool.com]
BMRT Bike [exluna.com] (by Don Kim) vs. "nVidia Bike" [webbschool.com]
BMRT Table [exluna.com] (by Goran Kocov) vs. "nVidia Table" [webbschool.com]
BMRT Markers [exluna.com] (by Rudy Poat) vs. "nVidia Markers" [webbschool.com]
I believe I've pretty definitively shown that either they have an actual RenderMan renderer running on their hardware (and access to the original data by four different authors), or this is a fake.
Sorry, I can't find the coffee cup or the Final Fantasy image. Maybe someone else can.
Re:Thinking it's a forgery (Score:2)
BMRT coffee [exluna.com] (by Horvatth Szabolcs) vs. "nVidia coffee" [webbschool.com]
Given the news that nVidia bought Exluna [com.com], I suppose it IS possible that they rendered from original data. Hmph.
I'd appreciate it if they fessed up and reported that they make a RenderMan renderer. I actually think that's bigger news than their exact hardware specs. It means "One interface to rule them all..."
Re:Thinking it's a forgery (Score:5, Informative)
More than that, the coffee cup is rendered with Entropy, not BMRT. Like all of those images, it was done by someone else, this one recently in an image contest.
The most obvious flaw, though, is that those images are raytraced, and that is not something anyone is claiming to do in realtime yet. It is beyond the scope of nVidia's processor, as it should be. Those images are scaled duplicates that aren't changed a bit, and there is no way an nVidia card rendered them, because there is no way the reflections would be the same, but they are. Reflection maps have a tendency to look correct, but not identical. There is also depth of field, which is not impossible, but is improbable for now.
Re:Thinking it's a forgery (Score:2)
The most obvious flaw though, is that those images are raytraced, and this is not something that anyone is claiming to do in realtime yet
I didn't read anywhere that those images were rendered in realtime. Either way, I wouldn't be surprised if those claims were made.
Those images are scaled duplicates that aren't changed a bit, and there is no way that an Nvidia card rendered them, because there is no way the reflections would be the same, but they are.
If the NV3x uses full-precision floating point operations, then there is no reason why a reflection map would look different; the algorithm is very standard. Env mapping looks horrible in some games because the hardware uses 'good enough' calculations with very tiny env maps and poor surface detail.
There is also depth of field which is not impossible, but is improbable for now.
Realtime DOF for games may be improbable, but not for rendering a single image. You can simulate DOF using multipass rendering, as explained in the OpenGL red book.
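The red book trick mentioned above can be sketched in plain Python (a toy 1D model of the idea, not actual OpenGL accumulation-buffer code; all the numbers are illustrative): each pass jitters the eye across the lens aperture while re-aiming at the focal plane, so averaging the passes blurs everything off that plane.

```python
# Toy 1D model of accumulation-buffer depth of field:
# jitter the eye across the aperture, keep the focal plane fixed,
# and measure how far a point's projection smears across passes.

def project(eye_x, point_x, point_z, focal_z):
    # Shear the view so the focal plane (z = focal_z) stays put
    # while the eye moves by eye_x across the aperture.
    return point_x + eye_x * (1.0 - point_z / focal_z)

def blur_radius(point_x, point_z, focal_z, aperture, passes=8):
    # Spread of the projected positions over all jittered passes
    # is the apparent blur after averaging.
    offsets = [(-aperture / 2) + aperture * i / (passes - 1)
               for i in range(passes)]
    xs = [project(e, point_x, point_z, focal_z) for e in offsets]
    return max(xs) - min(xs)

# A point on the focal plane stays sharp; one behind it smears.
print(blur_radius(0.0, point_z=10.0, focal_z=10.0, aperture=0.5))  # 0.0
print(blur_radius(0.0, point_z=20.0, focal_z=10.0, aperture=0.5))  # 0.5
```

In real OpenGL you would render the scene once per jittered camera and accumulate the frames; the math per point is the same.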
Real-time raytracing (Score:2)
I beg to differ.
I would have claimed the same before last week's Siggraph conference. But at that conference, I went to a panel discussion entitled something like "When will ray tracing replace rasterization?" The answer was "we'll do a hybrid approach instead". The first presenter showed an app (which was also running at RackSavers on the show floor) that was actually doing real time raytracing. It was rendering a conference room scene. You could dynamically change the viewpoint anywhere you like, move the furniture around, and it would even recompute a diffuse reflection solution progressively. Very impressive! He also showed another app that rendered the reflections of a car headlight at something like 5 fps.
I would also suggest that you check out the paper that someone pointed out from Stanford. They have written a raytracer that uses the pixel shader of the nVidia hardware to render triangle-based scenes at interactive rates. Very impressive.
I wouldn't discount those images as forgeries quite yet. With the new pixel shaders and vertex programs, the GPUs are rapidly becoming very versatile stream processors.
Re:Thinking it's a forgery (Score:2)
I'm sure nVidia and exluna have been working together on this for a while.
The guys at nVidia have some technology that translates RenderMan to something they know how to render.
Yes, probably a compiler. RenderMan is just a language and can be implemented on whatever you wish.
Congratulations, it's a forgery! (Score:2)
It says "capable of rendering", see. That means the chip has the same rendering capability in its vertex shader as the high-powered rendering engine that rendered these original pictures. It does not say they actually rendered this picture on this chip.
Get it now....?
Re:Congratulations, it's a forgery! (Score:2)
It would be better if you added <sarcasm> tags to your message. =)
Re:Congratulations, it's a forgery! (Score:2)
The coffee cup, the knight, and the girl all have adequate disclaimers. The motorcycle and tabletop both have language implying that they were rendered with Cg. Make your case there.
Last time I used sarcasm tags, I got a complaint that I didn't really need them. It was "obvious". I guess you can't please everyone. Style is a personal choice...
Re:Congratulations, it's a forgery! (Score:2)
I think it's a reasonable expectation that rendered images in the context of a new piece of graphics hardware were actually produced using that new graphics hardware.
I'm making the assertion that these specific images were instead produced by BMRT, and I'm even citing original images which can be compared with the images supposedly produced on the NV30. I think I've clearly demonstrated that there's a striking resemblance. It would have been far, far better if the article had cited the source of the models, and gone on to detail the process by which the original scene data was rendered on the NV30. Not doing so leaves me with no choice but to assume that the images were falsely produced (i.e., forgeries).
The responsibility to accurately and without question disclaim the images as NOT being actual new renderings produced by the hardware lies with the vendor (or the person presenting them). Since that was not uniformly done, I am accusing them of forgery.
Re:Congratulations, it's a forgery! (Score:2)
images supposedly produced on the NV30
It sounds like the difference is between what YOU expect from a disclaimer and what a lawyer or court would consider a sufficient disclaimer.
You seem to want some glaring statement like "Our hardware can draw pictures like this one, but we did not actually draw these exact pictures. These came from another fancy rendering program."
A lawyer might simply say "Our hardware can draw pictures like this one." "Like" is a sufficient disclaimer.
Of course it is meant as a deception. Unfortunately, deception is part of sales. If you really can't tell what it means, if you really can't see through the language, then you're gonna have a hard time out in the real world. You can complain all you want as to whether it's fair or right or whatever. The only rules that sales plays by are advertising laws (and sometimes not even those). The only way to call them on it is to take them to court. I don't think there's a case here.
Re:Congratulations, it's a forgery! (Score:2)
"Of course it is meant as a deception." As a consumer, I like not to be deceived. Being deceived enrages me. Especially when there are supposed to be layers of "fact checkers" between me and the news that I read. Those fact checkers have failed, and I am doing what I can to educate the public about this FRAUD.
"Like" is not a sufficient disclaimer, in the opinion of most engineers.
I'm pointing out the difference between what is implicitly claimed, and what is technological truth. If you're going to get pissed at me for that, then you have to understand that out in the real world, there are engineers who debunk myths and falsehoods. I'm doing my job, and I'm trying to educate the
"The only way to call them on it is to take them to court."
Or to humiliate them in a public forum, such as this one. The law is not your only defense - it's the last defense. Public ridicule can be an amazingly valuable tool.
Re:Congratulations, it's a forgery! (Score:2)
Obviously, I wasn't. Otherwise I wouldn't have pointed out that the images were NOT actually generated by the new hardware. Read all of the other posts in this article - most people WERE deceived. Yell at them, not at me.
"You should pay a little better attention to what you are reading."
Back at you.
"Pay attention to the details you read"
The burden of clarity rests on the publisher of news articles. When an article is UNCLEAR, this kind of forum is a perfect place to attack it. Why are you so eager to criticize ME? When I'M one of the FEW people who actually understood what these images are?
"you ASSUME"
You assumed I was trying to hold them to some legal definitions and business practices, and to somehow build a legal case against them. I couldn't care less about that. Bad assumption on your part. I was just publicly flogging them for their deceitful practices. I take it you yourself have never been deceived? I guess you should cancel your subscription to Consumer's Digest, huh?
Back to the coder analogy, the comments at the top of this source file sucked. They were unclear, and I'm trying to clear them up. If I *could* edit the page to say "THESE IMAGES NOT CREATED ON THE NV30" then I would. Since I can't, I'm doing what I can to make sure people realize that, here. Again, why does this somehow earn me your criticism?
I don't EXPECT any kind of behavior from vendors and news agencies. I don't EXPECT the law to protect me. But I'm glad that I can publicly criticize their deceitful practices. Repetition is boring, but - WHY ARE YOU YELLING AT ME? YELL AT THE IDIOTS WHO FELL FOR IT. And maybe, just maybe, join me in yelling at them for being deceitful, okay?
Looks a lot like 0.001fps (Score:2, Insightful)
Every halfway decent raytracing package can produce images of the same consummate quality (using only the CPU) at, say, one frame per minute. nVidia has yet to produce any proof that their new chip can even do that.
Remember, all the renderings are almost certainly taken from a static model, i.e. no animation, no time spent on matrix transformations. Now, what's the likelihood that the NV3x can actually render 25 of those in one second? Comparing '99's sneak-peek screenshots with today's (or yesteryear's) games: very low.
Hopefully, nVidia will provide a video clip of their creation in action sometime soon.
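The gap being described can be put in rough numbers (using the one-frame-per-minute figure from the comment above, which is illustrative rather than a benchmark):

```python
# Rough arithmetic: the speedup needed to go from an offline
# raytracer's one frame per minute to a realtime 25 fps.
seconds_per_offline_frame = 60   # one frame per minute
realtime_fps = 25

speedup_needed = realtime_fps * seconds_per_offline_frame
print(speedup_needed)  # 1500, i.e. over three orders of magnitude
```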
Nice link (Score:3, Interesting)
Issued by Snake Oil CA
Issuer:
E = ca@snakeoil.dom
CN = Snake Oil CA
OU = Certificate Authority
O = Snake Oil, Ltd
L = Snake Town
S = Snake Desert
C = XY
Subject:
E = brian@tangent.org
CN =
OU = Slashdot
O = Slashdot
L = Nowhere
S = Denial
C = US
Umm, yea sure I'll trust that.
Two things... (Score:2, Informative)
2. Trust the certificate. Just don't send any information you want kept secret. It's just encrypting the request/reply, not installing anything on your computer.
Re:Two things... (Score:2)
the flip side (Score:3, Insightful)
Once we have hardware that can render realistic scenes and humans in real time, there's going to be a sudden realization that for all this prettiness, there's nothing behind it.
imho, it's time we started really looking at interactive and reactive programming. Yes, AI research is a step in the right direction, also realtime english parsing stuff, but we need systems that can at least pretend to comprehend and react to realtime and infinitely variable human input.
Imagine King's Quest with those graphics, where anything you type in will be understood no matter what it says (short of l33t sp34k), and the game will react accordingly.
Graphics are pretty, but with nothing behind them, the graphics are just empty shells.
Re:the flip side (Score:2)
Besides, that stuff with the Final Fantasy scene and something like "... and you can make scenes like this" just sound silly for pretty obvious reasons.
Nah - show us some *in-game* scenes and/or real-time graphics to drool at instead of rendered spoons.
Re:the flip side (Score:2)
And after that we'll work on turning lead into gold.
Beyond very simple and well-defined contexts, natural speech parsing seems to require a solution to the Strong AI problem. Ditto a computer that responds intelligently to you outside a simple and well-defined context.
I'll happily settle for better game AI, as that's about all we'll be getting for the next 30-50 years or so.
Somebody needs an editor (Score:2, Funny)
Is this the kind of writing we get when buzz-words collide?
FiringSquad nv30 article (Score:4, Informative)
http://firingsquad.gamers.com/hardware/cinefx/default.asp [gamers.com]
Joy.
Re:FiringSquad nv30 article (Score:3, Insightful)
What it will result in is fewer banding problems, particularly in areas where there's little color variation over a large area, such as fog. Such artifacts are more obvious in moving pictures such as movies or real-time 3D than they are in static images.
Re:FiringSquad nv30 article (Score:2)
What I personally would like to see is a change from RGB to something with a bigger color gamut, something that's not *smaller* than what the human eye can see. We've got monitors with pixels small enough that our eyes can't tell the difference between one and two. We've got framerates so high we can't tell the difference between half and full framerate. But every color in every pixel of every frame is substandard. Entire classes of colors (royal purples, for example) cannot be expressed in RGB space. If you don't believe me, take a graphics class. You'll learn how incredible video cards really are, and how substandard the colors they're working with really are.
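The royal-purple claim can be checked numerically. Below is a sketch: the matrix is the standard XYZ-to-linear-sRGB conversion, while the chromaticity values are chosen for illustration (a saturated purple near the CIE "purple line"). A negative channel means the color lies outside the RGB triangle and can only be clipped, never represented.

```python
# Convert a CIE xyY chromaticity to linear sRGB with the standard
# XYZ -> sRGB matrix; negative output = outside the sRGB gamut.

XYZ_TO_SRGB = [
    ( 3.2406, -1.5372, -0.4986),
    (-0.9689,  1.8758,  0.0415),
    ( 0.0557, -0.2040,  1.0570),
]

def xyY_to_linear_srgb(x, y, Y):
    # Recover XYZ from chromaticity (x, y) and luminance Y,
    # then apply the matrix row by row.
    X = x * Y / y
    Z = (1.0 - x - y) * Y / y
    return tuple(r[0] * X + r[1] * Y + r[2] * Z for r in XYZ_TO_SRGB)

# A deep violet-purple chromaticity (illustrative values).
r, g, b = xyY_to_linear_srgb(0.25, 0.05, 0.2)
print(r, g, b)  # green channel comes out negative
assert g < 0    # i.e. not expressible in sRGB
```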
Actually, I just want FP colors so when I find a dark corner in an FPS and try to snipe at people, I don't get rocket-blasted by some asshole with the gamma jacked up sky-high who sees me a mile away.
DDR-ii (Score:2)
If the nv30's memory interface is only 64 bit, the main reason to wait for the card is its die shrink. DDR-II is a nonissue.
Re:DDR-ii (Score:2)
http://www.geek.com/news/geeknews/2001july/bch20010709006702.htm
This is what I found claiming it's Quad Pumped. I'm not saying the article is right, but I'm just looking for an article that claims otherwise.
Thanks!
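For reference, the arithmetic behind the bus-width question is simple (a sketch with illustrative clock numbers, not confirmed NV30 specs):

```python
# Peak memory bandwidth = (bus width in bytes) * clock * transfers/clock.
# DDR moves data twice per clock; "quad pumped" would be four times.

def bandwidth_gb_s(bus_bits, clock_mhz, transfers_per_clock):
    return bus_bits / 8 * clock_mhz * 1e6 * transfers_per_clock / 1e9

# A narrower quad-pumped bus can match a wider DDR bus on paper.
print(bandwidth_gb_s(128, 350, 2))  # 128-bit DDR @ 350 MHz  -> 11.2 GB/s
print(bandwidth_gb_s(64, 350, 4))   # 64-bit quad-pumped     -> 11.2 GB/s
```

Which is why, if the NV30 really were only 64-bit, the pumping scheme would matter far more than the DDR-I vs. DDR-II label.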
Re: (Score:2, Insightful)
personally, I'd prefer a sneak PEEK (Score:2)
Smells Like FUD (Score:2)
card recommendations.. (Score:2)
thanks
Images *NOT* rendered on an NV30 (Score:2)
All of these previews are just PR leaks to distract from the Radeon 9700 launch. Assuming the NV30 tapes out today, Nvidia will be very, very hard pressed to get a card in stores by Christmas Day. They have already missed the Xmas season.
Having said that, the NV30 will be quite amazing, and (from what we know of it) should best the also-amazing Radeon 9700 by quite a bit. To be more specific: it should be better for non-realtime hardware rendering of scenes that are currently rendered in software--like those Exluna pics that were shipped out in their PR--because it has more flexible shaders (we dunno if they're faster too, but this is also likely). Yes, it will be able to render those images, in "near-realtime", though certainly not actual realtime. It should offer better texel fillrate, especially in multitexturing situations, because it has an 8x2 pipeline organization instead of 8x1 like the Radeon, and because as a
But Nvidia is desperately late with the card, and by the time they get it out ATi may have a successor to the Radeon 9700 (perhaps
And these "previews" are nothing more than rehashes of Nvidia PR pdf's; they are vague not because sharkyextreme performed any difficult investigation, but because they are simply regurgitating teaser PR for a card which doesn't even exist yet.
Re:Mobile chipset? (Score:2)