The Future According To nVidia
NerdMaster writes "Last week nVidia held their Spring 2008 Editor's Day, where they presented their forthcoming series of graphics processing units. While the folks at Hardware Secrets couldn't reveal the details of the new chips, they posted some ideas about what nVidia sees as the future of computing: basically more GPGPU usage, with the system CPU losing its importance, and the co-existence of ray-tracing and rasterization on future video cards and in future games. In other words, the 'can of whoop-ass' nVidia has promised to open on Intel."
Who will have the better Linux driver support? (Score:5, Informative)
Re:Who will have the better Linux driver support? (Score:4, Insightful)
So no, the post isn't redundant, because this issue isn't yet solved (not to mention, how can a first post be redundant?).
Re: (Score:3, Insightful)
Considering how many problems I have always seen, I would say that even on Windows it is anything but trivial.
Video drivers suck. On whatever platform you choose.
Re: (Score:2)
Re: (Score:1)
I fail to see how this is redundant. I too choose video cards based on how well they are supported under Linux. Or rather, I choose the ones with the least shitty support. Any Linux user who's ever tried to use any OpenGL app more complex than glxgears knows the pain, so I reckon Linux (or any OS other than Windows, I suppose) support isn't a trivial or a fanboy issue. So no, the post isn't redundant, because this issue isn't yet solved (not to mention, how can a first post be redundant?).
Hmmm, I do too. Linux support is my sole criterion for buying any hardware.
Re:Who will have the better Linux driver support? (Score:5, Insightful)
Intel is going the Open Source road, trying to be as open as possible. Unfortunately, from a performance PoV their hardware sucks. Their products are intended as consumer-level, chipset integrated solutions and, considering that, work nicely. Don't try any 3D games, though.
ATi opened a lot of specs, so community-developed and completely open drivers are on the horizon. Unfortunately the horizon is quite far away, and the movement towards it is about as fast as a kid on a tricycle. The situation is likely to improve, though. Performance-wise, ATi may be a good choice if you'd like to play the occasional game, but they don't really compare to nVidia (which is unlikely to change soon).
In the end, I'm going to stick with nVidia for the near future, using Intel wherever low energy consumption is strongly desired (i.e. notebooks and the like). ATi just ain't my cup of tea; I wouldn't put a red card in a Windows box either. But my preference for nVintel is just that -- a preference. Go with whatever suits you best.
Re: (Score:1)
Re: (Score:2)
Re: (Score:2, Interesting)
Purists rage against it because it's against freedom and so on; pragmatists tend to like the full 3D acceleration that comes with it.
Bullshit.
Closed drivers suck for pragmatic reasons.
Just because YOU haven't paid the price yet doesn't mean it isn't true.
I bought two top-end nVidia cards (spent $350+ each on them), only to find out that, because my monitors don't send EDID information, their binary-blob drivers wouldn't work. The problem was that my monitors required dual-link DVI, and even though these top-of-the-line cards had dual-link transceivers built into the chip (i.e. every single card of that generation had dual-link transceivers
Re: (Score:2)
Though, in defense of nVidia, your problem does seem rather unique. Until very recently it was my understanding that most any screen made in the past decade ought to provide an EDID -- the standard's fifteen b
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
Re:Who will have the better Linux driver support? (Score:5, Funny)
I'm the CEO of NVidia and I spend all day reading slashdot. Despite that I hadn't noticed that Linux was popular until I read your post.
I'll tell the driver developers to start fixing the drivers now.
Thanks for the heads up
Jen-Hsun Huang
CEO, NVidia Inc
Re:Who will have the better Linux driver support? (Score:5, Informative)
And those drivers would actually be better. Better Linux support for less money.
So what's the holdup?
Re: (Score:1)
Re: (Score:3, Insightful)
No, the real reason very likely has to do with the geForce/Quadro scam. Specifically, the fact that you can take a geForce (typically, what, $200?) and soft-mod it into a Quadro (at least $500, and most are $1k and up).
Re: (Score:1)
I doubt very much that it's either of these. Remember, we only need specs for an interface; it doesn't have to be schematics for the whole card.
Well if you had the register specs you could get a bunch of Chinese VHDL hackers to make a compatible card. Actually I suspect that most hardware has an 'obvious' implementation from the register spec, and that obvious implementation is rather good. An example would be ARM processors.
OK, x86 implementations these days are seriously non-obvious. But I'd bet that graphics cards are more like an ARM than an x86. And that is why they don't want to release the spec.
No, the real reason very likely has to do with the geForce/Quadro scam. Specifically, the fact that you can take a geForce (typically, what, $200?) and soft-mod it into a Quadro (at least $500, and most are $1k and up).
Well that's another reason. They're also prob
Re: (Score:2)
Well if you had the register specs you could get a bunch of Chinese VHDL hackers to make a compatible card.
Maybe I'm missing just how crucial "register specs" are, but we already have something like that -- we already have an API spec. Two, at least. It would now take Chinese VHDL and software hackers to do it, but it could be done.
They're also probably worried that someone would sue them for patent infringement if the released specs allowed ATI to find something.
Possibly, but they could check that themselves -- after all, ATI has released specs.
Re: (Score:1)
Maybe I'm missing just how crucial "register specs" are, but we already have something like that -- we already have an API spec. Two, at least. It would now take Chinese VHDL and software hackers to do it, but it could be done.
By API you mean DirectX, right? DirectX can be implemented in a lot of ways, some fast, some slow. The register-level spec would be something like
"Register at base address+0x1010 is a command register. Write these commands to draw these polygons"
Someone at NVidia said "The register spec is very neat. Essentially we do object orientation in hardware".
Which is intriguing. I can imagine that the registers would be a linked list of interfaces. Each one would have a GUID. So you'd have an IFrameBuffer interface
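Purely for illustration of what a "register-level spec" actually buys you: below is a hedged, made-up sketch (every offset, command code and name is invented -- this is not nVidia's real layout) of how a driver might poke a hypothetical command register like the 'base address+0x1010' one quoted above.

/* Hypothetical register-level interface -- every offset, command code and
 * name here is invented for illustration; no real GPU uses this layout. */
#include <stdint.h>

#define HYPO_REG_COMMAND   0x1010u  /* invented command register offset      */
#define HYPO_REG_STATUS    0x1014u  /* invented status register offset       */
#define HYPO_REG_VB_ADDR   0x1020u  /* invented vertex-buffer address reg    */
#define HYPO_REG_TRI_COUNT 0x1024u  /* invented triangle-count register      */
#define HYPO_CMD_DRAW_TRIS 0x0003u  /* invented "draw triangle list" command */
#define HYPO_STATUS_BUSY   0x0001u

/* 'mmio' would be the card's register BAR mapped into the driver's address space. */
static inline void reg_write(volatile uint8_t *mmio, uint32_t off, uint32_t val)
{
    *(volatile uint32_t *)(mmio + off) = val;
}

static inline uint32_t reg_read(volatile uint8_t *mmio, uint32_t off)
{
    return *(volatile uint32_t *)(mmio + off);
}

/* Point the engine at a vertex buffer, kick off a draw, spin until idle. */
static void draw_triangles(volatile uint8_t *mmio, uint32_t vb_gpu_addr, uint32_t tris)
{
    reg_write(mmio, HYPO_REG_VB_ADDR, vb_gpu_addr);
    reg_write(mmio, HYPO_REG_TRI_COUNT, tris);
    reg_write(mmio, HYPO_REG_COMMAND, HYPO_CMD_DRAW_TRIS);
    while (reg_read(mmio, HYPO_REG_STATUS) & HYPO_STATUS_BUSY)
        ;  /* busy-wait; a real driver would use interrupts */
}

A real register spec is hundreds of pages of definitions like these, plus the semantics behind each command, which is why it says a lot more about the silicon than an API spec like OpenGL or DirectX ever does.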
Re: (Score:2)
By API you mean DirectX, right?
Or OpenGL, yes.
"Register at base address+0x1010 is a command register. Write these commands to draw these polygons"
Still not sure I see how that's a trade secret. Not disputing it, just over my head at this point.
If you know the instruction set of a RISC chip, an in-order implementation is rather obvious.
Wouldn't the same hold, though? There are fast implementations, and there are slow ones.
nVidia is a hardware company. I kind of wish they stuck to hardware.
Re: (Score:2)
So what's the holdup?
This small thing called trade secrets. nVidia's drivers (and I'm assuming hardware specs) contain trade secrets that they'd rather not make freely available to their competitors. The fact that no one else can create a product that can compete with them still tells me that this trend of keeping their trade secrets locked up in a proprietary format isn't going to change anytime soon. To be honest, I'm happy that nVidia even puts out Linux drivers that work with minimal hassle. Sure they may sometimes con
Re: (Score:2)
Re: (Score:2)
Re: (Score:3, Informative)
The last time I looked at the graphics scene, they were actually neck and neck. There were reviews for new cards from each, and depending on the publisher, they might go one way or another.
At no point do I remember ATI no longer being relevant.
So, do you have anything to back that statement up, or are you just going to keep parroting the nVidia party line?
Fullscreen TV output? (Score:2)
Not nVidia. (Score:4, Insightful)
But nVidia is the last to publish specs, or any sort of source code. ATI and Intel already do one of the two for pretty much all of their cards.
So, in the long run, nVidia loses. It's possible they'll change in the future, but when you can actually convert a geForce to a Quadro with a soft mod, I very much doubt it'll be anytime soon.
Yawn (Score:5, Insightful)
Another paid-for article. Yawn.
Re: (Score:2)
Re: (Score:2)
Re: (Score:3, Informative)
Re: (Score:2)
You're right about the average business user's need for desktop horsepower, but you overlooked the main consumer of business MIPS today, and that's Symantec Anti Virus. We used to depend on Windows version updates to slow everything down so we could upgrade our hardware, but now we just have to upgrade Symantec. I wonder if any of that work could be off-loaded to the GP
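For what it's worth, the scanning part at least maps onto a GPU pretty naturally. Here's a hedged CUDA sketch -- the kernel, the hard-coded four-byte signature, and every name in it are my own invention and have nothing to do with Symantec's engine -- where each thread tests one byte offset of a buffer against a fixed pattern:

#include <cuda_runtime.h>
#include <stdio.h>

#define SIG_LEN 4
/* Toy signature; a real scanner matches thousands of patterns at once. */
__constant__ unsigned char d_sig[SIG_LEN] = { 0xDE, 0xAD, 0xBE, 0xEF };

/* Each thread checks whether the signature starts at its own byte offset. */
__global__ void scan_kernel(const unsigned char *buf, size_t len, int *hit)
{
    size_t i = blockIdx.x * (size_t)blockDim.x + threadIdx.x;
    if (i + SIG_LEN > len) return;
    for (int k = 0; k < SIG_LEN; ++k)
        if (buf[i + k] != d_sig[k]) return;
    atomicExch(hit, 1);  /* any single match flags the whole buffer */
}

int main(void)
{
    const size_t len = 1 << 20;  /* 1 MB of (zeroed) test data */
    unsigned char *d_buf;
    int *d_hit, hit = 0;
    cudaMalloc(&d_buf, len);
    cudaMemset(d_buf, 0, len);
    cudaMalloc(&d_hit, sizeof(int));
    cudaMemset(d_hit, 0, sizeof(int));

    scan_kernel<<<(unsigned)((len + 255) / 256), 256>>>(d_buf, len, d_hit);
    cudaMemcpy(&hit, d_hit, sizeof(int), cudaMemcpyDeviceToHost);
    printf("signature %s\n", hit ? "found" : "not found");

    cudaFree(d_buf);
    cudaFree(d_hit);
    return 0;
}

The catch, and probably the reason nobody ships this, is getting the data across the PCIe bus and back fast enough, plus everything an AV engine does besides raw pattern matching.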
Re: (Score:2)
LOL (Score:2)
The terminal emulator runs slow on anything less
Heck, with physics processors and GPUs, I need an AV card and I could go back to a Pentium 3...
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Slightly OT, but does anyone know where I can find a micro that has at least one USB port and preferably runs Linux? I have to fit the CPU into a 4in diameter rocket, and so far most of the ones I'm finding require daughter boards that won't fit.
http://www.gumstix.com/ [gumstix.com]
or
http://gumstix.com/waysmalls.html [gumstix.com]
As they say "linux computers that fit in the palm of your hand"
I believe the verdex boards are 2cm by 8cm.
Price is about the same as desktop gear, figure you'll drop about $250 on a basic working system.
Re: (Score:2)
Re: (Score:2)
Re:Yawn (Score:5, Funny)
The future according to anonymous coward.. more trolling, offtopic, flamebait-ness, with the odd insightful or funny.
The future according to Ballmer.. inflated Vista sales (it's his job, damnit!).
The future according to Microsoft shill 59329.. "I hate Microsoft as much as the next guy, but Vista really is t3h w1n! Go and buy it now!"
The future according to Stallman.. Hurd.
BTW The promo video for HURD is going to feature Stallman as a Gangsta rapper, and features the phrase: "HURD up to ma Niggaz."
Re: (Score:1)
Might want to put that all-important ~ at the end of your post next time. So some zealous-o-mod doesn't hit you with an off-topic, just because they think you're being serious and disagree with you.
Re:Yawn (Score:5, Funny)
Zen koan. (Score:3, Funny)
Ahhhh. You are a Zen Master. Please, teach us more!
Re:Yawn (Score:4, Insightful)
- What's the future according to Minute Maid anyway? Really, I'm intrigued!
- Did you notice the interesting parallel between the future according to AT&T and where the American government seems to be steering? More bars in more places (and as many people behind them as possible(?))? What a strange coincidence...
Re:Yawn (Score:5, Funny)
Re: (Score:2)
Re: (Score:1)
but corn syrup really takes the cake, pun intended, in getting people fat.
Re: (Score:2)
The future according to Nvidia - faster GPUs using more stream processors.
Re:Cell Proc? (Score:1)
Re: (Score:1)
Even in a games console it's probably hard to keep all
In the Year 2000 ... (Score:1)
Guest: "The future, Conan?"
Conan: "That's right, Let's look to the future, all the way to the year 2000!"
and then
"In the Year 2000"
More GPGPU = More Parallelism (Score:2)
Not all of the uses for the gobs of cheap parallel processing power are apparent yet. But people will find cool things to use it for, just as they found cool things to use home computers for. In a way, we are now going through a home supercomputing revolut
Price / Performance works for me (Score:4, Informative)
The more competition the better.
Anyone who worries too much about the cost a good GPU adds to the price of a PC doesn't remember what it was like when Intel was the only serious player in the CPU market.
This kind of future, to me, spells higher bang for the buck.
Really? (Score:2)
Yep, I'm sure the Intel Devs have all taken a sabbatical.
Re:Really? (Score:4, Informative)
The ones that work on GPUs? I'm not sure they ever even showed up for their first day of work.
Surprise, Surprise... (Score:5, Insightful)
Re: (Score:2, Interesting)
Re: (Score:2)
Re: (Score:2, Interesting)
Re: (Score:2)
This answer is very interesting, because I seem to remember that MMX was introduced because Philips planned to create specialty co-processor boards (around '96/'97) to off-load multimedia tasks, so that sound processing would take fewer CPU cycles, and to introduce video processing. Intel did not like this idea and added MMX just to cut off such things.
Re: (Score:2)
I am a bit skeptical. If AMD's experimentation with combining the CPU and GPU bears fruit, it might actually mean the end for traditional GPUs. nVidia doesn't have a CPU that can compete with AMD and Intel, so I think nVidia is the one in trouble here. But I suppose nVidia has to keep up appearances to keep the stock from plummeting.
I would concur with that, but I'd add that nVidia is also missing an OS and applications. While that follows from not having a CPU that's binary-compatible with Intel and AMD, nVidia has nothing at all here.
My guess is that Intel has the weakest video, is perhaps talking to nVidia, and nVidia is trying to pump up its value, while AMD is trying to figure out how best to integrate the CPU and GPU. That is, this is about politics and price for nVidia.
Some problems today's GPUs have are: they run too hot, take too much powe
Sounds like BS (Score:1, Flamebait)
As to the claim that the GPU will replace the CPU: Not likely. This is just the co-processor idea in disguise. Eventually this idea will fade again, except for some very specific tasks. A lot of things cannot be done efficiently on a CPU. I have to say I find the idea of
Re: (Score:2)
If nVidia or any other GPU manufacturer tries to get too generalized they run the risk o
Re: (Score:2)
The highlight of the press conference seems to be the censored part revealing that nVidia will be fab'ing ARM-11s in the near future, in direct competition with the Intel Atom. Looks like they're not planning to go down without a fight...
Competing (Score:4, Insightful)
Well API isn't their department (Score:4, Interesting)
This could be an area where OpenGL takes the lead, as DirectX is still rasterization-based for now. However, it seems that while DirectX leads the hardware (new DX releases usually come out about the time the hardware companies have hardware to run them), OpenGL trails it rather badly. 3.0 was supposed to be out by now, but they are dragging their feet badly and have no date for when it'll be final.
I imagine that if MS wants raytracing in DirectX, nVidia will support it. For the most part, if MS makes it part of the DirectX spec, hardware companies work to support that in hardware since DirectX is the major force in games. Until then I doubt they'll go out of their way. No reason to add a bunch of hardware to do something if the major APIs don't support it. Very few developers are going to implement something that requires special coding to do, especially if it works on only one brand of card.
I remember back when Matrox added bump mapping to their cards. There were very few (like two) titles that used it, because it wasn't a standard thing. It didn't start getting used until later, when all cards supported it as a consequence of having shaders that could do it and it was part of the APIs.
Re:Well API isn't their department (Score:5, Informative)
GPGPU absolutely demands specialized APIs - forget D3D and OGL for it. These two don't even guarantee any floating point precision, which is no big deal for games, but deadly for GPGPU tasks.
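To make that concrete, here's roughly what the 'specialized API' route looks like -- a hedged CUDA sketch of single-precision SAXPY, where you get plain float arrays and explicit kernels instead of abusing textures and fragment shaders. Assumes a stock CUDA toolkit; whether the arithmetic is fully IEEE-compliant still depends on the hardware generation, which is exactly the parent's precision point.

#include <cuda_runtime.h>
#include <stdio.h>
#include <stdlib.h>

/* y[i] = a * x[i] + y[i], one element per thread -- the classic SAXPY. */
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main(void)
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);
    float *x = (float *)malloc(bytes), *y = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    float *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, x, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, y, bytes, cudaMemcpyHostToDevice);

    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, dx, dy);

    cudaMemcpy(y, dy, bytes, cudaMemcpyDeviceToHost);
    printf("y[0] = %f (expected 5.0)\n", y[0]);  /* 3*1 + 2 */

    cudaFree(dx); cudaFree(dy);
    free(x); free(y);
    return 0;
}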
Re: (Score:1)
And GPGPU can be done with OpenGL 2.0 - approx. 10 months ago, we presented a Marching Cubes implementation in OpenGL 2.0 that even outperforms its CUDA competitor
http://www.mpii.de/~gziegler
So don't throw out the GPGPU baby with the floating point bathwater
Re: (Score:2)
Re: (Score:1, Funny)
Well, DUH! It has the word "open" in it, doesn't it?
Will they become platform supplier? (Score:5, Interesting)
That would be great! (Score:2)
Wouldn't that be great! It's about time that graphics processing, IO, and other things were sent to their own processors. Anyway, wasn't that done before - Amiga?
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
Also, those early tests
http://techreport.com/discussions.x/14584 [techreport.com]
suggest that Isaiah, when it comes to performance per clock, is finally comparable with AMD/Intel. Who knows what we'll see later...
PS. Games are _the_ only thing (an
Re: (Score:1)
In fact I think even that is overkill - you could add a framebuffer, hardware cursor and a blitter to the core chipset and steal some system RAM for the actual video memory. Negligible die area and low power consumption.
Re: (Score:2)
(nvm that I absolutely hate them - I prefer something more portable, but economies of scale don't work in my favor)
Re: (Score:1)
But the mass market doesn't want laptops like this. They want something which lets them run MS Office at work, or a web browser and an email client at home. In machines like that, discrete graphics doesn't really add much performance, and it kills battery life. It also adds a few bucks to the build cost. So companies like NVidia don't really have anything that market needs.
Re: (Score:2)
But OEMs that build cheap laptops supposedly want not only that, they want an integrated package, with everything (not only the chipset/chips but also the CPU) included in one nicely tested and supported solution.
The whole point - I wonder/suspect that Isaiah and its refinements
Raytracing (Score:1)
Re: (Score:2)
nVidia's split personality (Score:5, Interesting)
http://scarydevil.com/~peter/io/raytracing-vs-rasterization.html [scarydevil.com]
However... Dr Philipp Slusallek, who back in 2005 demonstrated how even a really slow FPGA implementation of raytracing hardware could kick the butts of general-purpose processors (whether CPU or GPGPU), has been working as a "Visiting Professor" at nVidia since October 2007.
They're still playing their cards close to their chest.
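For anyone wondering why raytracing is such a natural fit for parallel hardware (FPGA or GPU): every primary ray is independent, so one-thread-per-pixel is all you need. Here's a hedged CUDA sketch of the core hit test, ray vs. sphere, with the scene and shading stripped out and every name invented for illustration:

#include <cuda_runtime.h>
#include <math.h>

struct Vec3 { float x, y, z; };

__device__ float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
__device__ Vec3  sub(Vec3 a, Vec3 b) { Vec3 r = { a.x - b.x, a.y - b.y, a.z - b.z }; return r; }

/* Distance along the (normalized) ray to the nearest hit, or -1 on a miss. */
__device__ float hit_sphere(Vec3 orig, Vec3 dir, Vec3 center, float radius)
{
    Vec3  oc   = sub(orig, center);
    float b    = dot(oc, dir);
    float c    = dot(oc, oc) - radius * radius;
    float disc = b * b - c;
    return (disc < 0.0f) ? -1.0f : -b - sqrtf(disc);
}

/* One thread per pixel: shoot a primary ray straight down -z,
 * write white on hit, black on miss. */
__global__ void trace(unsigned char *image, int w, int h, Vec3 center, float radius)
{
    int px = blockIdx.x * blockDim.x + threadIdx.x;
    int py = blockIdx.y * blockDim.y + threadIdx.y;
    if (px >= w || py >= h) return;

    Vec3 orig = { (float)px / w - 0.5f, (float)py / h - 0.5f, 1.0f };
    Vec3 dir  = { 0.0f, 0.0f, -1.0f };
    float t = hit_sphere(orig, dir, center, radius);
    image[py * w + px] = (t > 0.0f) ? 255 : 0;
}

int main(void)
{
    const int w = 256, h = 256;
    unsigned char *d_img;
    cudaMalloc(&d_img, w * h);

    Vec3 center = { 0.0f, 0.0f, -2.0f };
    dim3 block(16, 16), grid((w + 15) / 16, (h + 15) / 16);
    trace<<<grid, block>>>(d_img, w, h, center, 0.3f);
    cudaDeviceSynchronize();

    /* A real program would copy d_img back to the host and write it out as an image. */
    cudaFree(d_img);
    return 0;
}

The expensive part in a real tracer is traversing an acceleration structure per ray rather than this single intersection, but the per-pixel independence is the property that makes dedicated raytracing hardware attractive.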
Re: (Score:3, Informative)
MY - YES - MY can of WHOOP-ASS, Tesla has arrived! (Score:1)
Future? How about the present? (Score:2)
So nVidia, instead of spouting off about how great the future's going to be, ho
Re: (Score:1)
It's a bit strange they'd support CUDA on Linux but not Vista, though.
so basically... (Score:2)
Funny how things work, isn't it.
Re: (Score:1)
Haven't we seen this all before? (Score:1)
Re: (Score:1)
It has indeed, and there are certainly grounds for caution, but there's a chance that this time it's different. Different silicon processes are involved in making a fast vector processor (as one needs for GPUs) than in making a CPU, so putting them together isn't simply a matter of finding enough space in the package. Couple this with the fact that C
The merging of CPU and GPU (Score:1)
Intel obviously sees the threat from the GPU makers, but their attempts at breaking into the GPU market haven't been very successful.
Their next-generation effort is called Larrabee [theinquirer.net], which uses multiple x86 cores linked by a ring bus.
It actually reminds me of the PS3 SPU setup, but Intel is using the GPU functionality as a wedge into the GPU market, instead of pushing it for general computation. But, since standard C code will work on it, you can rewrite the entire stack to be a physics co-processor or fol
The Future of Computers: The GPFPGA? (Score:2)
This has been done before. (Score:2)
Bigger and bigger (Score:2)
Some other efforts focus on "trimmed down and more efficient" but then tend to fail in the power output arena.
I was wondering how difficult it might be to make a motherboard or graphics card with multiple processors: one small one for general-purpose computing (basic surfing, word processing, 2D graphics or basic 3D), and a bigger one that could "kick in" when needed, like an ove
Re: (Score:2)
Sounds like a good idea (Score:2)
silly (Score:2)
But I don't think they will catch on. It makes little sense for people to stick extra cards into their machines for computation. Instead, you'll probably see
How About Some Backwards Compatibility? (Score:2)
I call bullshit (Score:1, Interesting)
Fuck nVidia.