

3DLabs Launching New GPU
h0tblack writes "...or VPU as they've seen fit to call it. The Register is reporting that 3DLabs will be releasing the P10 later this year. It's targeted at workstation and gaming markets with OpenGL 2.0 and DX9 drivers having been seeded to developers already. Could be interesting as 3DLabs have been one of the key players in the development of OpenGL 2.0. The P10 has over 200 SIMD processors throughout its geometry, texture and pixel processing pipeline stages to deliver over 170 Gflops and one TeraOp of programmable graphics performance together with a full 256-bit DDR memory interface for up to 20 GBytes/sec of memory bandwidth. More info can be found in the press release." There are also examinations of the new chip on Anandtech, Tom's Hardware, and no doubt many other hardware sites too.
Heh (Score:2, Funny)
Damn, why do we spend these bucks on that PC architecture...
Capitalism...
new kernel option? (Score:3, Funny)
Processor type and features
...
Floating point emulation? [y/N]
Floating point acceleration via 3dLabs VPU? [Y/n]
But what about the gaming market? (Score:1)
Re:But what about the gaming market? (Score:1)
Could be interesting, but (Score:3, Interesting)
Anyone Remember the Permedia? (Score:2, Troll)
The specs were great, but the actual implementation and drivers, well, sucked hard.
Re:Anyone Remember the Permedia? (Score:2)
Actually, the Permedia was a nice card for its time. The OpenGL drivers were better than anything else out there (remember, this was back when the Voodoo 1 was king of the hill, and its OpenGL drivers were perpetually beta and you couldn't run in a window anyway).
Re:Anyone Remember the Permedia? (Score:1)
It was especially bad as they were marketing it as a gaming card initially. And I actually put off buying an Nvidia card waiting for it.
I had a Permedia 2 (Score:2)
The specs were great, but the actual implementation and drivers, well, sucked hard.
Sure, the Permedia wasn't the quickest card on the block in its time, and neither was the Permedia 2 nor the Permedia 3...
But both the NT and Win9x drivers were absolutely 100% rock-solid, the OpenGL implementation was flawless and very, very fast, and the card supported a whole bunch of features that no other consumer-level chipset at the time supported, like anisotropic filtering, or multiple video overlay windows at once. The RAMDACs were really good on the Permedia 2 also - razor-sharp, much much better than the TNT2 I ended up replacing it with. It was also rather faster at GUI acceleration than the TNT2, which was a surprise and a disappointment.
Really it was a semi-pro card at consumer-level prices. It would never have been the card you bought if you wanted the ultimate Quake framerate, but it absolutely oozed quality.
It's the only graphics card I've ever used that hasn't annoyed me in some way, be it dodgy image quality (NVIDIA, S3), unstable drivers (ATI, NVIDIA), bus latency greediness (NVIDIA, S3, Matrox, often leads to choppy, stuttering audio), or just being dog-slow (all the usual suspects - hello Trident, earth calling). I've never used a 3dfx card for more than a few minutes so I can't really comment on them, but I suspect their poor OpenGL support would have annoyed me greatly.
If only 3Dlabs had 3d-accelerated Linux drivers (preferably open source) I'd buy another one in a heartbeat. I've been disappointed with every other card I've had since my Permedia 2...
Re:I had a Permedia 2 (Score:1)
Yes (Score:2)
If Creative makes a card with them with OSS Linux drivers and *NO FAN* then I'm sold!
no fan? (Score:3, Funny)
there you go! you're welcome!
oh, you DIDN'T want it to catch on fire...
-c
What are those GFLOPS they mention? (Score:1)
Re:What are those GFLOPS they mention? (Score:1)
Re:What are those GFLOPS they mention? (Score:1)
Re:What are those GFLOPS they mention? (Score:1, Informative)
If all you're doing is multiplying 4x4 matrices by 4x1 vectors, and you pay very close attention to the programming docs, then yes, you can perform 170 billion floating-point ops per second. But it's not something you could use as a general-purpose processor.
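To put a rough number on that (a back-of-the-envelope sketch, assuming every last flop goes into transform work, which it won't): a 4x4 matrix times a 4x1 vector is 16 multiplies and 12 adds, i.e. 28 flops, so 170 Gflops is on the order of six billion vertex transforms a second at peak.

    #include <stdio.h>

    /* Transforming a 4x1 vector by a 4x4 matrix, the bread-and-butter
     * operation of the geometry stage.  Each output component is a
     * four-term dot product (4 multiplies + 3 adds), so one transform
     * costs 16 multiplies + 12 adds = 28 floating-point ops. */
    static void transform(const float m[4][4], const float v[4], float out[4])
    {
        for (int i = 0; i < 4; i++)
            out[i] = m[i][0]*v[0] + m[i][1]*v[1] + m[i][2]*v[2] + m[i][3]*v[3];
    }

    int main(void)
    {
        const double flops_per_transform = 28.0;     /* counted above        */
        const double claimed_flops       = 170.0e9;  /* the marketing figure */

        /* Purely illustrative: if every one of those flops went into
         * matrix-vector work and nothing else, the peak would correspond
         * to roughly this many vertex transforms per second. */
        printf("~%.1f billion transforms/sec, in theory\n",
               claimed_flops / flops_per_transform / 1e9);

        float m[4][4] = {{1,0,0,0},{0,1,0,0},{0,0,1,0},{0,0,0,1}};
        float v[4] = {1, 2, 3, 1}, out[4];
        transform(m, v, out);
        printf("identity transform check: %g %g %g %g\n",
               out[0], out[1], out[2], out[3]);
        return 0;
    }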
Re:Wow! (Score:2)
Re:What are those GFLOPS they mention? (Score:1)
GPU = Graphics Processing Unit. There you go.
adding to my last post (Score:1, Interesting)
Re:adding to my last post (Score:1)
Re:adding to my last post (Score:2, Funny)
High end? MySQL? BWAHAHAHA! Good one.
Re:adding to my last post (Score:2)
High end? MySQL? BWAHAHAHA! Good one
I'm pretty sure he meant high level, as in more complicated than.
--
Evan
i don't get it (Score:2)
And what, pray tell, does "armed with nasm" mean?
How long til we see THIS Slashdot article? (Score:4, Funny)
Really, these things are getting massively more complicated than your ordinary P4 or Athlon.
And think: there's one less layer between the OS and the framebuffer!
Re:How long til we see THIS Slashdot article? (Score:4, Informative)
Not really, though. They have simple units, then they put a whole bunch of them on there. They don't need nonsense like branch prediction and register renaming and all that. But they certainly are complicated in their own way.
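A way to picture it (just an illustration of the data-parallel model, not anything specific to the P10's hardware): every pixel runs the same short, branch-free program, so each "processor" only has to be a dumb little ALU.

    /* Conceptual sketch only (not the P10's actual instruction set): the
     * same short, branch-free program is applied to every element
     * independently, so each "processor" can be a very simple unit; no
     * branch prediction or register renaming is needed because there are
     * no data-dependent branches.  Each loop iteration here stands in
     * for one SIMD lane doing a texture-modulate operation. */
    void modulate(const float *tex, const float *color, float *out, int n)
    {
        for (int i = 0; i < n; i++)       /* in hardware: all lanes at once */
            out[i] = tex[i] * color[i];   /* no branches, no dependencies   */
    }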
I'm waiting for return to bus-based computing (Score:3, Interesting)
Each card is basically a CPU board with its own memory. The common bus between cards is really a switch, to limit card-to-card contention. One card is the bus master running the kernel. Processes can be shuttled between CPU boards as processing power is available.
The thing is we're getting to the point where just about every PCI device has a CPU on it (NICs with encryption/acceleration engines, RAID cards). Why not just put high-speed general purpose CPUs on the cards and use it as a highly integratable/segmentable cluster?
The actual kernel could do more scheduling and less work, since the "NIC" CPU card could theoretically run large parts of the IP stack in addition to the NIC driver, as an example.
Re:I'm waiting for return to bus-based computing (Score:2)
About how much horsepower do you think you would need to do something like IPSec? Is that handled by a secondary processor already anyway?
Re:I'm waiting for return to bus-based computing (Score:1)
Each card would have its own driver, and all the OS would need to do is know how to communicate through a standard interface in order to use the device's resources (like the network stack).
This would make OS development simpler.
That is how a microkernel would work well:
smart peripherals that control their own resources.
Re:I'm waiting for return to bus-based computing (Score:2)
Re:I'm waiting for return to bus-based computing (Score:1)
Re:I'm waiting for return to bus-based computing (Score:1)
Re:I'm waiting for return to bus-based computing (Score:2)
Re:I'm waiting for return to bus-based computing (Score:2)
Blades are just space consolidators from what I've seen; there's no common bus for moving data or memory.
Fitting a whole computer on a card and injecting the display onto the host doesn't really count either. There's usually no way to move data between the environments, and since they run incompatible processors there's no way to offload processing from the host to the card and vice versa. They're often no more than x86 emulation accelerators.
The system I'm thinking of actually has the blades working together, sharing a common bus, potentially sharing memory as well via a NUMA-type architecture.
Re:I'm waiting for return to bus-based computing (Score:2)
Back in around '90-91, DEC was building SMP (up to 4-way) 486's that used a 'corollary-bus'. There were somewhere between 16 and 20 slots on a *VERY* sparse motherboard. Each card had a specific purpose: CPU cards (up to 4), memory cards (also up to 4 I think, possibly 8) and the rest were general-purpose EISA slots IIRC. Typically you'd have SCSI and something akin to a Digiboard for your pre-TCP/IP network.
BTW - didn't Digiboard RULE?! Best products and support I've ever come across.
sedawkgrep
Beyond3d (Score:4, Informative)
Bleeding? (Score:2, Interesting)
One great feature is the virtual memory, which should improve the appearance of depth and richness of models. I wonder how many more textures designers will be able to cram onto a model. Does this mean more games will start to utilize multi-pass rendering, and that id will rewrite their engine once again for models with massive amounts of textures? I haven't kept up with the latest trends in 3D game technology, so maybe someone more informed can tell the rest of us.
A question... (Score:4, Insightful)
When the GeForce3 came out it didn't have much of a clock speed increase, but it boasted features that, if taken advantage of by developers, would make games look *MUCH* better. And yet, the only trend in the gaming industry that I've spotted is cranking up the poly counts.
Re:A question... (Score:2, Informative)
Re:A question... (Score:2)
Re:A question... (Score:2)
Re:A question... (Score:1)
Re:A question... (Score:3, Funny)
1xAA? (Score:2)
Re:A question... (Score:1)
Re:A question... (Score:2, Insightful)
Why do the game companies think that we really care about how cool-neato-wow the water looks? If it looks vaguely like water, then yeah, I'll think that it's water and keep on playing the game. You don't have to wow me with water effects, modeling every drop of water as it is absorbed by my character's clothes in the game.
BethSoft had better make this game work, unlike Daggerfall. Daggerfall sucked unless you were some insane fanboy willing to put up with the constant crashes and headaches caused by this buggy piece of crap. You had to be in love with the concept of the game to truly like it. I'm tempted to actually buy an Xbox for it because I don't trust them with my PC to play it.
I think that these new GPUs are too powerful, as nobody can possibly generate the artwork that will use them quickly enough. It takes much longer to generate a 100,000-poly model than a 5,000-poly model in a program like 3D Studio Max (assuming that they are of equal quality). It's going to be a couple more years until we see any games really taking advantage of these new features.
Re:A question... (Score:1)
The gaming industry tends to be behind the curve in utilizing the more advanced features of a card, since 90% of their audience is still using one of the several previous-generation cards out there. My aging gaming box only has a TNT2 Ultra, and I haven't noticed a sudden lack of ability to play recent titles.
Given the price of the card, it seems to be targeted at the smaller animation shops with a few animators running various 3D apps (most of which cost between 2x and 9x the $900 model per seat) on NT boxen, rather than the Quake "I need to run at 1500 FPS for nothing more than my own phallic extension reasons" crowd.
Re:A question... (Score:2)
But are these programs limited by the power of the card, or the ability of the CPU to feed it information?
Aside from simply supporting the features of OpenGL, are the GeForce 4Ti's slowing down the 1.9 gig Athlons, or the other way around?
-l
Re:A question... (Score:4, Interesting)
That describes the market a few years ago, but no more. These days, with GeForce 2 MXs being dirt cheap and no one having performance issues with them, no one--except neurotic geeks--gives any thought to updating their video cards.
But can somebody tell me if there are products currently on the market that take full advantage of the *current* crop of video cards?
The answer is an emphatic "no." I'm a game developer, and we were focusing on the Voodoo 2 as the low end until very recently. And the Voodoo 2 is still a much more powerful card than people realize, providing you work *with* it and don't just ask it to render 50,000 polygons per frame. I don't think we ever got to the bottom of the performance available in that card, and we certainly, certainly, never got anywhere near what you can really do with later cards, like the original GeForce. All of the fancy stuff you can do with the GeForce 3--mostly based around vertex shaders--is not backward compatible with 90% of the market, so we never touched it.
Fanboys don't want to hear that their cards aren't being pushed anywhere near the limits. They are much happier to have poorly written games that have high polygon counts and bad art, because then they can justify the money they spent on a new computer and/or video card.
Re:A question... (Score:2)
Re:A question... (Score:1)
They said that the game will run at about 30 fps with a GF3, not due to bad architecture, but because it's maxing out what the card can give. They said that high-end computers at the time of shipment will still run the game at around 60-70 fps, stable.
Anyway, until you can render stuff like Final Fantasy (the movie) in real time, you aren't there yet.
Re:A question... (Score:2, Funny)
Re:A question... (Score:2)
Life, as it is, is fluid "infinite FPS". When you capture life on video, you capture everything in that 1/30 of a second, including all movement.
If you look at a single frame of a movie, you see that everything is blurry, but when it's all running, it looks clear.
Computer graphics are created frame by frame, "like life", so to get the maximum fluidity, like in real life, you need this infinite FPS.
I might have written this rather roughly, but I simply can't find the page where I read all about it.
Re:A question... (Score:2)
There is a much better solution, and 3dfx was attempting to introduce it, namely motion blur.
The problem is that all those motion blur effects are linear, and they create even remotely realistic-looking images only for objects that move linearly over the time period. If the object accelerates and rotates at the same time, the required motion blur isn't linear, and the only way to make it look good is to render a lot of frames for the time period and blend them together. Not to mention the morphing of objects: imagine a bullet hitting a wall. In 1/30th of a second the bullet is moving towards the wall, deforming during the hit, and bouncing off in some direction, all during a single frame. How on earth are you supposed to render that realistically if you only calculate positions for the start and the end of the period, as is normally done for those real-time "motion blur" effects?
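For what it's worth, the brute-force approach described above maps straight onto OpenGL's accumulation buffer; a sketch along these lines, where render_scene() and the sub-frame count n are placeholders for the application's own code:

    #include <GL/gl.h>

    /* Render n snapshots of the scene at intermediate times and average
     * them in the OpenGL accumulation buffer.  render_scene() stands in
     * for the application's own drawing code, n is arbitrary, and the GL
     * context is assumed to have been created with an accumulation
     * buffer.  (Doing this per frame in real time is exactly what's too
     * expensive, which is rather the point.) */
    void render_motion_blurred(void (*render_scene)(double t),
                               double t0, double t1, int n)
    {
        glClear(GL_ACCUM_BUFFER_BIT);
        for (int i = 0; i < n; i++) {
            double t = t0 + (t1 - t0) * (i + 0.5) / n;   /* sub-frame time */
            glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
            render_scene(t);               /* object may rotate, morph, ... */
            glAccum(GL_ACCUM, 1.0f / n);   /* add this snapshot, weighted   */
        }
        glAccum(GL_RETURN, 1.0f);          /* write the blend to the screen */
    }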
Re:A question... (Score:3, Interesting)
Not over the GF2 Ultra series, but it was a pretty big jump from the MX and GTS cards most people had. In addition to the HUGE FPS jump in games like Quake III, it had all those eye-candy programmable things that are going into things like Aquanox and The New Doom (tm). Also, the memory increase to 64 then 128 megs of DDR graphics RAM allows for insanely better Anti-Aliasing at "normal" gaming resolutions like 1024x768. The NV25 core (GF4 Ti series) increases this further, where you can turn on full-scene anti-aliasing and still get killer performance in your old games.
I only play Quake 3 and RTCWolfenstein on a regular basis, but my GF2 GTS (on an Athlon XP 1600+) pushes a masochistic 0.3 FPS in Quake 3 demos with 4xFSAA. Testing with the new card (128 megs of 600MHz graphics RAM, I never could have imagined in 1999) shows that I'll turn on 8 way Aniso, 4xFSAA and STILL gank 60fps on my 1024x768 LCD. Starting at $199, which is my limit for a graphics card.
And trust me, there is a TON of difference in visual quality with 4xFSAA on using a 15" LCD.
So yes, the programmable pixel shading pales against the power of prettier pictures in your "old stand-by" games, like Q3A. (Alliteration is your friend.)
Re:A question... (Score:2)
Unreal was beautiful and I liked the music, so I enjoyed it on my (RIP) Voodoo II. After that, better graphics just make me bored after the initial "cool graphics" experience.
As another guy already said, not even the Voodoo II has been maxed out yet. AA looks _definitely_ good, but are those games more fun? If the game experience (what you do, how immersive it is) doesn't get better, then better graphics just ruin the game.
Another thought: I still like the pixels in Doom II combined with a high framerate. It's like real life through a wet lens. But a high framerate with AA and everything, if the game is not really, really well done (Unreal II level or above), just looks like a crappy movie seen through a high-quality microscope.
Re:A question... (Score:1)
I don't think a GeForce 4 4600 could handle it so yeah, they can use the processing power =).
Re:A question... (Score:2)
Re:A question... (Score:2)
Is this the technology PS3 needs? (Score:2)
Re:Is this the technology PS3 needs? (Score:1)
That's funny, somehow they have sent DX9 drivers.. (Score:1)
Re:That's funny, somehow they have sent DX9 driver (Score:1)
Re:That's funny, somehow they have sent DX9 driver (Score:1)
Also keep in mind hardware manufacturers have a lot of input into the new features implemented in DX and M$ is more than happy to bring them in to consult as the production progresses.
It would not surprise me if the 3DLabs people have had alpha copies of DX9 to play with for a few months now.
impressive specs (Score:1)
Creative has bought 3d labs (Score:4, Informative)
It doesn't mean anything to me (Score:1)
Absolutely nothing
Anyone can take any product and make a glowing press release over it to get everyone excited about it, but that doesn't say anything for the silicon, or its support chipsets, or its drivers when it finally reaches production
until then
OpenGL 2.0 (Score:1)
I hope having a chip out like this doesn't affect the adoption of OpenGL 2.0 by other card/chip manufacturers. I also hope OpenGL 2.0 won't be to 3DLabs what Direct3D/X is to Micro$oft.
Re:OpenGL 2.0 (Score:2, Informative)
Standards? (Score:1)
Re:Standards? (Score:1)
You can find a bit of info on the OpenGL 2.0 shading language over at
3DLabs' white papers section [3dlabs.com]. There is also quite a bit more information on OpenGL 2.0 there as well.
Re:Standards? (Score:1)
A neat idea, but not feasible within the constantly evolving graphics processor industry.
This is what OpenGL 2.0 is about (Score:5, Informative)
OpenGL 2.0 addresses exactly your concerns - a vendor-neutral shader programming language, and this is precisely why you're seeing 3Dlabs pushing hard for it. It seems they will be first to market with a fully programmable graphics pipeline, and they need the software technology to go with it...
DirectX 9 also addresses the same issues and provides a standard shader language (actually, DirectX 8.1 has a standard shader language already, but it lacks a certain amount of the programmability that will be present in DirectX 9), but there are a lot of reasons for the graphics card vendors to favour OpenGL over DirectX, not least that it isn't controlled by Microsoft and isn't tied to Windows.
Hopefully OpenGL 2.0 will see a resurgence in OpenGL use. I don't like the idea of the 3D market being controlled by Microsoft, and I don't think the 3D vendors do either. Kudos to 3DLabs for leading the way!
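For a taste of what that vendor-neutral language looks like, here's a trivial vertex shader in the C-like language 3Dlabs has been pushing. This is only a sketch: the syntax and the entry points shown are based on 3Dlabs' proposal and may differ in detail from what finally ships, and the "scale" uniform is made up purely for illustration.

    #include <GL/gl.h>

    /* A trivial vertex shader in the proposed OpenGL 2.0 shading
     * language.  "scale" is an application-supplied parameter, just to
     * show that the app can feed its own data into the pipeline. */
    static const char *vertex_src =
        "uniform float scale;\n"
        "void main(void)\n"
        "{\n"
        "    gl_FrontColor = gl_Color;\n"
        "    gl_Position   = gl_ModelViewProjectionMatrix\n"
        "                  * (gl_Vertex * vec4(scale, scale, scale, 1.0));\n"
        "}\n";

    /* Assumes the OpenGL 2.0 shader entry points are available (on most
     * platforms they have to be fetched through the extension mechanism). */
    GLuint build_program(void)
    {
        GLuint vs = glCreateShader(GL_VERTEX_SHADER);
        glShaderSource(vs, 1, &vertex_src, NULL);
        glCompileShader(vs);

        GLuint prog = glCreateProgram();
        glAttachShader(prog, vs);
        glLinkProgram(prog);
        /* Activate with glUseProgram(prog), then set the uniform with
         * glUniform1f(glGetUniformLocation(prog, "scale"), 2.0f). */
        return prog;
    }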
TI 34010... (Score:4, Insightful)
Any website proclaiming full programmability as new or revolutionary is simply demonstrating a lack of historical knowledge. 34010/34020-based boards competed with the first-gen fixed-function graphics accelerators for Win 3.x, but couldn't compete on price/performance with the fixed-function BitBLT engines from S3 et al, and the flexibility of being fully programmable meant nothing to PC users who were accustomed to dumb EGA/VGA cards.
Re:TI 34010... (Score:3, Interesting)
The 34010 kicked butt! It was used in Atari's Hard Drivin' game. It had a lot of neat features, including hardware X/Y addressing (i.e. move x,y,pixel), bit-level addressing (you could twiddle any bit in memory, or write a word/byte on any boundary), and built-in simple graphics operations (copy a block of memory, XOR source and destination, use the larger of the two, subtract, union, difference, add without overflowing, etc.)
But what was *REALLY* cool was the math coprocessor, the 34020. It was blazingly fast (almost, but not quite as fast as the industry-crushing i860 IIRC), but it featured a programmable microcode so you could create your own instructions and get every ounce of performance out of the machine. I'm still looking for a processor that will allow that... we're getting those with modern NPUs (cradle [cradle.com], intel IXP1200 [intel.com]), but these generally lack floating point functionality.
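For those who never touched one, the "built-in simple graphics operations" mentioned above were essentially raster ops: block copies with a combining function. A rough C equivalent of an XOR blit (purely illustrative, not actual 34010 code):

    /* Copy a w-by-h block into a framebuffer at (x, y), combining source
     * and destination pixels with XOR.  "pitch" is the framebuffer width
     * in pixels.  The 34010 did this sort of thing in hardware, with the
     * combining function selectable per operation. */
    void blit_xor(const unsigned char *src, unsigned char *dst,
                  int pitch, int x, int y, int w, int h)
    {
        for (int row = 0; row < h; row++)
            for (int col = 0; col < w; col++)
                dst[(y + row) * pitch + (x + col)] ^= src[row * w + col];
    }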
End of VGA (Score:2, Offtopic)
Re:End of VGA (Score:2, Interesting)
Re:End of VGA (Score:1)
I think Tom mentions this simply because the P10's ability to handle multiple requests is a good solution to the requirements set forth for M$'s next-gen GUI, Longhorn.
The P10 shows Longhorn is possible and that VGA is no longer needed. This is the "End of VGA". However, I'd expect legacy support for VGA in video cards for a long time to come.
Where's the Oy! (Score:2, Funny)
Wow..... (Score:1)
Opining on the Why: Creative's issues w/ hardware. (Score:2, Redundant)
blockquoth the poster (evermore with emphasis added):
Now then, the emphasized bits beg the question: Why has Creative gained and lost its footholds in these areas?
For this Creative customer, the reason is and has always been (across all product lines) one, very important issue: Software.
When and where the Creative development machine manages to mate decent, uncluttered, non-glitzy, tweakable, and trouble-free software (very, very seldom, IMO) to the excellent-to-amazing hardware that they are deservedly famous for, the results have been very good indeed.
However, in the normal course of events, Creative's hardware ships with installers, drivers, ancillary programs, updaters, bundled "features", and enough just outright useless crap to annoy any self-respecting consumer. And while I admit that this occurs largely on the Windows platforms, you should admit to yourself that that's Creative's largest area of concern. Fortunately, they haven't yet figured out that they could push for the inclusion of enough Creative ad-ware to sicken a telemarketer drone into the driver packages for other platforms.
So, in this reader's experience, the issue is simple: too much software that users don't want or need, and too many features that won't work without all the glitzy junk (anyone like using the LiveDrive product? It's great, but the software needed to make it worthwhile, the remote control, is a cast-iron bitch: crashy, seldom updated, and too tied to useless trash in the installation). These issues seem fairly prevalent across Creative's product lines, and they're killers.
Fortunately, the answer is simple. Creative needs to give the people who buy their hardware good, stable, and full-featured drivers without the need for a dozen attendant Creative-logo-displaying bits of crapware. If that part's impossible, then it'd at least be nice to be able to grab reference drivers from the chipset manufacturer (how many people skip the card vendor's drivers in favor of NVidia's Detonator reference drivers?)
Failing those... license the hardware designs to vendors who'll give us good, honest, and stable software. Of course, they can always continue to lose business to the competition; after all, it's . . . "good for the market".
Could this be why nVidia (Score:1)
but that's speculation.
Re:Could this be why nVidia (Score:1)
bump: TNT2
new: GeForce
bump: GeForce2
new: GeForce3
bump: GeForce4
new: this fall
I don't see where the problem is here. Maybe the schedule you're thinking of is flipped.
Re:Could this be why nVidia (Score:2)
Re:High-End Video Cards (Score:4, Interesting)
Speak for yourself. I'm a gamer, and I'm more than willing to fork out $900 for a good video card. Hell, if I spent $700 on the GeForce1 DDR when it first came out, why the hell not spend $900 on a fully OpenGL-accelerated card? I've seen the current generation of high-end cards from 3DLabs, and if this new generation is anything like the current one, it's worth the $900 for gamers.
Re:High-End Video Cards (Score:1)
Re:High-End Video Cards (Score:1)
Re:High-End Video Cards (Score:2)
Re:High-End Video Cards (Score:3, Informative)
No, that's not a typo, these graphics cards cost as much as a nice Athlon system.
I don't care. It's still a lot cheaper than a top of the line SGI workstation.
The ratio of costs for all the parts in a typical PC (motherboard : CPU : disk : power supply : OS : graphics card) has shifted some over the years. More accurately, though, as the performance of certain key pieces has increased enough to adequately fulfill the needs of users, it's natural to start looking to satisfy unmet needs.
An OpenGL card like this would be wonderful for scientific visualization, CAD, CAM, etc.
While the price is an important point, in my market $600-$900 is not a big deal.
Re:High-End Video Cards (Score:1, Troll)
- A.P.
Re:High-End Video Cards (Score:1)
Re:High-End Video Cards (Score:2, Informative)
I think that puts you very squarely into the "fuckwit" category, so the original poster was still right.
Re:High-End Video Cards (Score:2)
new generation is anything like the current, it's worth the $900 for gamers.
The Wildcats deliver a whopping 9 fps in Quake. That's nine frames per second. I work in 3D animation and I'd love to have a Wildcat, but to play games? No, thanks. Let's hope the new processor is a bit more gamer-friendly (like nVidia's Quadro4, for example).
And personally I'd never spend more than 300 on a gaming card (I would - and in fact have - on a professional card if I thought it was a good investment).
RMN
~~~
Re:High-End Video Cards (Score:1)
When the GeForce3 first came out, Apple sold it for $499. Only when it was released for PC consumers did it drop to the prices we see today.
Re:High-End Video Cards (Score:2)
3D Labs was recently purchased. I won't bore you with the details but the purchasing company was none other than Creative Labs. Creative Labs' focus has not historically been the professional workstation, it has been mainstream consumers.
Although the initial cards brought to market will be targeted at the workstation market, it is highly likely that Creative Labs will leverage this technology to produce a card targeted at the gaming market. One of the benefits of the architecture is that it can handle a larger number of textures with a more limited amount of memory, through caching. This will allow Creative Labs to trade off memory size for memory speed in the gaming market.
Re:High-End Video Cards (Score:1, Troll)
The first and the last categories are one and the same for these purposes.
Re:High-End Video Cards (Score:1)
I know you are, but what am I!:P
I consider myself a gamer AND a creator. So, given that the card performs as well as it should, I'll gladly have my cake and eat it too, thank you very much! Fuckwit!
Re:Why not... (Score:1)
Re:Not to be offtopic (Score:1)
Re:Not to be offtopic (Score:1)
Re:OpenGL Window managers/desktop environments (Score:2)