Graphics Software

New ATi 3D Chip

Cooper writes "Saw a piece on Sharky Extreme about a new ATI chip dubbed the Rage 6, which they say is going to be used in Microsoft's X-Box as well as in PCs. It's got an on-board geometry processor like the NVIDIA GeForce." Wow. 2 gigapixels per second? Wow. The graphics market is really starting to heat up - check the earlier story about the Voodoo 4 & 5.
This discussion has been archived. No new comments can be posted.


  • by jd ( 1658 ) <imipak@yahoGINSBERGo.com minus poet> on Thursday November 18, 1999 @12:17PM (#1521016) Homepage Journal
    2 gigapixels per second... BBC resolution was 320 x 200, in mode 4... Hey! I can play Revs at 1 million frames per second! :)

    Seriously, though, sheer pixels per second is kind of meaningless. (Actually, it'll be state changes per second. :) In most vector-based or polygon-based products, the bottleneck is in calculating the outer perimeter of the shapes.

    Personally, I think pixel-based displays are a dead-end, anyway. Aliasing is horrible, and the techniques to get round it do so by making the picture too blurred to tell the difference.

    What's needed is a pure analogue polygon-based display, capable of area fill and non-trivial shapes. 3D would be nice, too. Anyone got a holographic projector they could lend me?

  • Seem to be a lot of announcements to coincide with Christmas shopping. Some of them seem to be the usual vapourware, i.e. "don't buy a graphics card for Christmas because our sooper-dooper card will be out just after... HONEST!"

    Anyway, which graphics card is "the business" right now, and which is expected to be?

    On another note, I'm not too enthusiastic about Microsoft's X-Box - it seems the forthcoming Sony PSX 2 is going to beat it into the ground, and the spec of Microsoft's product doesn't look that different from a PC anyway. Anyone think any different?

    The Playstation 2 looks like it might be the first games console I'll buy since I bought one that played table tennis games in glorious black & white! :-)
  • Ack. I've never been so much as a "softcore" action gamer, but even so I really don't see the need for that much horsepower just to play shoot-em-up fantasies as hyper-realistically (oxymoron intended) as possible.

    But deep end video chipsets do have a purpose just a short ways down the road. True immersive VR worlds. 3D gogglevision with lag time well below the perception threshold (10ms, perhaps?). Gibson-esque Cyberspace. The Matrix.

    Between screaming video and fast wires, the technology is almost here. How long will it take for the content to follow?
  • by Erich ( 151 )
    Do you need to plug it into the wall, like you do for the Voodoo 5?

    I found it really funny that because each of the new chips uses 7 watts of power, a board with more than two of them can't get enough from the bus (AGP provides 20W of power)... so if you look at the picture, there's a brick that you have to plug into your wall outlet and then into the back of the card... /me is wondering how much heatsink/fannage you'll need for 4x7-watt processors...
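
    Taking the 7 W per chip and 20 W AGP figures above at face value, the power budget works out like this:

        2 chips x 7 W = 14 W  (fits in AGP's 20 W)
        3 chips x 7 W = 21 W  (already over budget)
        4 chips x 7 W = 28 W  (8 W short -> hence the external brick)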

  • The graphics market is really starting to heat up - check the earlier story about the Voodoo 4 & 5.

    Good god, where have you been? The acceleration of the graphical capabilities of PCs over the past two years (starting with 3dfx's DooDoo) has been incredible. Even last year's games were aeons beyond the games of the year before, and before that, improvements in graphics technology and 3D were dog slow. We have been in the middle of a revolution in 3D graphics speed and quality for more than two years now. Hardly "just starting to heat up".
  • This probably does imply something about it being extremely powerful (although there's probably a Kurt Vonnegut warning available)...

    This may parallel (fatally, as is the case for automotive analogies) automobile mufflers; I find that the really powerful automobiles have extensive exhaust systems, whilst any car with a "wimpy" exhaust system is itself necessarily "wimpy."

    A graphics card that requires inordinate amounts of power might, on the one hand, be making flashy use of electricity to make it seem cool, but might truly be providing a whopping lot of "rendering power."

    Of course, the killer question is whether or not there will be XFree86 accelerator code that can actually harness this power... Otherwise, these monstrous, smoking-at-the-ears graphics boards may be paper dragons, parallelling the Intel MMX, where few if any programs actually made real use of the optimization...

  • The only thing I disagree with is that I think anything polygon-based will eventually be a dead end too, and will start yet another generation of more 'organically' styled displays.

    Got quite a long way before that happens, though - think of the amount of information necessary for a 15" x 15" x 15" display that produces a true 3D holographic image!

    Of course, the next step beyond that would be something that produces not only 3D, but 'solid' displays that can be touched, a la the Holodeck concept. But you'd better be prepared to actually duck that rail gun shot!
  • The entire PC world has gone mad for the highest-speed piece of old tech available. Instead of Better, Faster, Stronger we have Newer (package), Faster (clock speed, anyway), Flashier. I understand it on a money-saving level: it is way easier for hardware developers to revamp their already successful product lines, and in theory if the manufacturer saves some $ then so does Joe User. But how long can this go on? The 8086 is already almost 20 yrs old and still going strong with a bunch of add-ons and improvements. What is needed is some fundamental changes to the way it is designed, and screw backward compatibility. That can be emulated :) Perhaps we will see something really new from Crusoe...
  • The tech is here, but when was the last time you saw a set of VR goggles for $200? It's the typical tech development cycle: companies won't make it affordable/practical until there's demand. There won't be demand until the products are affordable/practical. It takes one product to break this cycle, but no one wants to devote the resources (LOTS of $$) to develop that first inexpensive product on their own.

    -----
  • It is nice in a free market to vote with your $$.

    A lot of people give a lot of hot air to Open Source in general and Linux in particular.

    Nvidia did a very nice thing by providing an optimized, glx version of the SVGA X server for the TNT cards. 3Dfx has recently made some moves in this area, but I still tend to think they are "closed thinkers" (GLIDE).

    The question is: what has ATI done for me lately? Why should I even consider giving them my money?
  • by tukka ( 43211 ) on Thursday November 18, 1999 @12:46PM (#1521028)
    Ever since the TNT, nothing has excited me about video cards. I expect them to leapfrog each other in speed at regular intervals.

    Of course, larger textures, more polygons, and features like full-screen antialiasing, environmental bump mapping and true color are great, but it is hard to get worked up over any particular video card when each of them has only bits and pieces of the really cool features, and when it comes down to it, all they really do is allow some minor improvements in image quality and let you up your screen resolution and get more frames. Great for the obsessed gamer, but game developers still have to make games with the low end in mind, so the games themselves aren't tremendously more impressive from a graphical perspective.

    For those reasons I'm a lot more excited about the Dolphin and PSX2 than I could ever be about a mere video card, or even a new class of video cards (like those with on-board T&L, which is meeting with mixed enthusiasm from developers btw.)

    But all these incremental advances and all the competition are great. It means that maybe in 3 or 4 years, there'll be PC games where the developers aren't limited by hardware anymore; rather, the "developer's bottleneck" will be the designer's creativity, effort and resources. That would be cool. :)

  • Well, not exactly... we just need to make the pixels smaller than the eye can resolve, and do so cheaply. Whatever happened to those 300-dpi displays I've heard about? (though we may need 600 dpi...)
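
    A rough back-of-the-envelope, assuming the textbook figure of about one arcminute for normal human visual acuity (an assumption, not a measurement):

        1 arcminute           ~ 0.00029 radian
        pixel pitch at 30 cm  ~ 300 mm x 0.00029 ~ 0.09 mm
        dots per inch         ~ 25.4 / 0.09      ~ 290 dpi

    So ~300 dpi is about right at reading distance; viewed from farther away, you'd need proportionally less.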

    I don't think raster displays will work for that. Mine can't even handle 100 dpi and still be sharp. (though I guess I can say my monitor does its own antialiasing ;-) )

    Analog has been tried before -- they're called vector displays (ala Spacewar and early Atari arcade games). For some odd reason they didn't catch on permanently. Go figure.

    Finally, if you want to know how to create a 3-D image, I've found a link [mit.edu] to ways of doing it.
  • by jdube ( 101986 )
    DAMMIT!!! I just got a Voodoo2 1000 PCI card for $50 (because of the @!$#%* built-in ATI Mach64 I have there isn't an AGP slot - my mobo SUCKS)... and now they're coming out with cards that new games will require! Dammit, dammit, dammit! *sigh* There's always Quake 3 and Unreal Tournament... ;)


    If you think you know what the hell is really going on you're probably full of shit.
  • Ah, a gaming card that has a geometry setup engine in hardware? Nice. That was the deciding factor when I bought the FireGL card 2 years ago -- the Permedia chip.

    Now, comparing that FireGL card to other things available, it's quite lackluster.

    • It won't do 3D in a window

    • Its geometry setup actually slows down what my dual PIIs could do by themselves (at least, it feels that way)
    I don't know exactly how much of the setup this new hardware will do... This could be interesting!
  • by Hrunting ( 2191 ) on Thursday November 18, 1999 @12:52PM (#1521032) Homepage
    I call it the Eyeball(tm) and I've patented it and GPLed it. It can do over 3 quadrillion equivalent pixels (I say 'equivalent' because it deals in quantum elements, of which there are an uncountable number in a pixel) per millisecond and when combined with another Eyeball(tm) in a dual setup, can actually create realistic stereovisual effects. It can take input from the real world and give it to you in astonishing 3D quality, and with the new additions 'LSD', 'X' and 'Louisville Slugger', you can make it generate colors you may never have seen before. Unfortunately, such advanced technology comes at a price. The incredibly complex nature of the Eyeball(tm) is such that it requires a proprietary socket, the EyeSocket(tm) to interface with your system properly, and of course you'll need a brain (we realize that this excludes a vast majority of the world's game players, but the ones who do have one will greatly appreciate this invention).

    Don't settle for wussy video cards that are limited to only 'pixels' and may or may not be out after Christmas. Use the Eyeball(tm) (or two) and enjoy true reality .. TODAY!

    (please note that overclocking the Eyeball(tm) or removing the Eyeball(tm) from the EyeSocket(tm) in any way voids the warranty and may damage said Eyeball(tm))
  • Sure it was... of course, there are some people who would swear by your typo!
  • See the thread on Glide being open-sourced. If we had some specs, maybe there's more than meets the eye here.

    But, essentially, an implementation is available.

    Pan
  • First it's OS of the month (Win 2k, Win 98, Win CE), then CPU of the day (PIII, Athlon, K6-3), and now gfx card of the week. It's all talk for a computer that may never be seen in stores. MS doesn't do computers - if they did, even Gateway and Dell would go nuts at the idea that MS is competing with them.
  • Yeah, but I'm sure it's going to be obsolete next year, just like the rest of the graphics hardware, right? ;-) Plus, is there really a market for something like this? I mean, doesn't this new generation of cards already have more than enough power for any game or application that end users could possibly want to run?
  • Personally, I like the expanded colour range are on the fly hardware particle generation.
    All you need to do is press down on the socketsto reseat the firmware for a minute or two.
    Days of fun.

    Pope
  • Perhaps Crusoe is just an interpreter that translates x86, MMX, 3DNow, SIMD, Velocity Engine, MIPS, PowerPC, (and on, and on, you get the point...) code into instructions for an as-yet-unannounced, completely new processor where it will then run at unheard-of speeds and efficiencies.

    Think of it as a hardware-interpreting-coprocessor!

    • Yes, it goes without saying that if there exist drivers for some of Windows 95, 98, NT 3.5, NT 4.0, 2000...

      It's far more fun to forget about that... :-)

    • The comparison to MMX is still reasonably fair; there has been much talk, but between the varying implementations from different vendors, the deployment of useful drivers is doubtless rather limited, with "accelerated" games coming mostly as demos that come with a particular graphics card...
    • As for karma, all you need do is to post things that will be perceived as interesting. If you maintain enough self-control to avoid flame-bait, and have some useful information, your karma will head up into the hundreds...

      (No, there probably aren't many people with hundreds of karma points. My count is only in the "single" hundreds, not in multiple hundreds...)

  • I think Dell just said it is going to start focusing more on things other than hardware.



  • All of us in our shop have been reading reports of the "new-and-improved" graphics chips that are supposed to fly at this and that speed, in the giga-pixels-per-second range.

    All is well, if those giga-pixels-per-second can be translated into SERIOUS USE.

    For a gamer, the giga-pixels performance might be sufficient, but for other serious uses - like REALTIME SIMULATION, for example - giga-pixels-per-second performance doesn't do us any good if it can't do simple refreshes at 30 frames/second or more, at a resolution of 2048*2048*32 bits/pixel.

    REALTIME simulations _ARE_ important to our shop, and the adage "A picture is better than a thousand words" rings very true for us, for there is no way we can catch faults by poring over the terabytes of data generated in a typical simulation.

    We have reached the limit of our human capabilities in our demanding simulation environment, and when we _NEED_ the hardware to support us, all we see is marketing hype that does not translate into real use for us.

    My hope is that one day the marketing hype will go away, and serious users like us will get the products that we truly require.

    Anyone know how long we have to wait until _REALLY USEFUL_ products will come to the market?


  • Um, because ATI has released the specs for all their current Rage chips? Including info on the TV tuner included with the All-in-Wonder series?

  • I seriously doubt that the Transmeta Crusoe [transmeta.com] will be a CPU that would be as useful for running "application" logic as it would be for running "graphix processor" logic.
    • The former ("application processor") tends to involve grabbing bits of memory from here and there, comparing them to other bits of memory, jumping, often adding something, and sometimes calling subroutines.

      That may be a gross oversimplification, but there you go...

    • The latter ("graphics processor") will be doing a whole lot of operations involving XORs, filling regions with values, and even doing some tight loops oriented towards filling regions with "shading."
    The graphics processor is rather more likely to find useful some operations that do "mass updates," which is rather like what a DSP does; that is quite different from the "lots-of-control-statements" that you'll get with a "conventional"/"application-oriented" CPU.

    The patents that Transmeta has been granted somewhat confirm this point of view; the patents represent ways of optimizing the emulation of those "lots-of-control-statements."

    There may be a real killer graphics chip right around the corner, although it is easy to argue that the last three years have involved the continuous release of successive generations of "more-and-even-more-killer" graphics processors.

    I'm not sure that there is a "Transmeta" of the graphics world; it's probably not appropriate to talk about such until next February when you might conceivably be able to buy some Transmeta product...

  • I find that it's better to be a smartass and go for the cheapo humor points.

    I wish I had a nickel for every time someone said "Information wants to be free".

  • I find it smarter to put "Go ahead and moderate me down for daring to say this" somewhere in my mail so as to come across as a martyr.

    Oh, and I bet I will get moderated down for speaking the truth on this issue, but I will always be ready to give of my karma to serve the readers of slashdot!

    -
    We cannot reason ourselves out of our basic irrationality. All we can do is learn the art of being irrational in a reasonable way.
  • Thank you for that incredibly well informed and wonderfully articulated argument.

    My ATI All-in-Wonder card works great for me. I've had absolutely no problems with it, and now that ATI has released the specs for it, I'm really looking forward to being able to use it for video capture.

    Is there any particular reason you don't like ATI?
  • I just wanted to mention that, IMHO, since graphics have become so big, there are just more and more fighting games.

    Good Thing: Games are more flexible. Good ideas can become better.
    Bad Thing: Since graphics have become less and less of a barrier, there's been less originality in games. I long for the days of the NES, SNES, and Genesis, where game programmers had to come up with ORIGINAL ideas. Now it's almost always shoot-em-ups!
  • This is funny.

    Why are you looking to companies like ATI and 3dfx to build accelerators for realtime simulations? These companies build cards for people who want to play computer games. Duh.

    What you need is to rent time in one of The Caves [uic.edu].

  • Due to the complexity of the problems surrounding graphics, there are a number of really innovative solutions coming to market. This particular chip and the nVidia product represent evolutionary changes; however, a number of revolutionary changes will come to market around the middle of next year.

    A number of companies have specialized in developing really unique and interesting graphics engines which they are marketing to the big makers in the form of logic cores. They range from faster engines that render more polygons per microsecond, to more unusual designs that render blocks of the screen at a time, to really unique designs that take polygons completely out of the picture. These engines will find their way into the next-generation graphics chips very shortly.

    Additionally, there have been a number of advances in the architecture of the systems themselves that will make for significant advances in graphics. The advent of really high-performance memory (Rambus, DDR, etc.) will certainly improve performance. And then there are some really unusual designs coming that use copper interconnect technology to give roughly the same performance as "system on chip" designs. And with all that, the die sizes of the chips themselves will keep shrinking (I think the new ATI chips are done on a .18 micron process). So suffice it to say, things are only going to get more interesting.

  • Okay, that is a reasonable answer.

    It seems to me that even with the benefit of "the specs" ATI video always seems to be "acceptable" or "functional" instead of "really nice" (ala TNT) or "great".

    I will continue to use Nvidia products, but I will refrain from calling people who buy ATI stuff names ;-)

    -Peter
  • The Voodoo2 may be outdated by today's standards (compared to the GeForce 256, Voodoo 4/5, etc.) but it still does very decent graphics for a 3D-only card. I'd rather spend $50 on a Voodoo2 than spend $300 on a 2D/3D card just to play a game or two, maybe once or twice a week. Just remember that there is still a market for the Voodoo2 card/chipset, as there is still a market for the AMD K6-2 350MHz processor. If I wanted the best in everything and had the money to pay for it, I would still hold out... since all the new cards are is year-old video cards on steroids. $600 for the Voodoo 5 is just plain overkill and overly expensive!
  • I disliked ATI for its GL performance. I cried when someone came to me with a Rage Pro and wondered why it didn't get up to par with my G400 - which I was myself crying about because it couldn't beat the NVidia TNT2. *sigh* But... but... ATI is the most supported card in Linux next to S3. If they did release their specs for video capture I may *gasp* go and get one for my mother. I have a video capture card and a G400. I'd like to combine these into one card and free up an IRQ. Actually: Dear Santa, I'd like DVD, video capture, 3D and 2D all in one card. With Linux support. Failing that, how about 8 more IRQs? :)
  • Seriously, though, sheer pixels per second is kind of meaningless. (Actually, it'll be state changes per second. :) In most vector-based or polygon-based products, the bottleneck is in calculating the outer perimeter of the shapes.

    Not always; it depends what you're doing. With simple scenes (fewer than 10k polygons), using texture maps, lightmaps, bump maps, translucency maps, etc., at high resolutions (if you want to reduce aliasing), major pixel fillrate is needed - it's still a big bottleneck (one of many).

    Personally, I think pixel-based displays are a dead-end, anyway. Aliasing is horrible, and the techniques to get round it do so by making the picture too blurred to tell the difference.

    Reality itself is aliased; it's just that the pixels are really, really small. There's no such thing as pure analog. And anti-aliasing techniques do not soften or blur the picture at all (the way filtering techniques do); they make the edges cleaner, but still crisp. Supersampling (a popular method of anti-aliasing that requires huge fillrates) duplicates what our vision does (the eye "supersamples" what it sees, by averaging all the rays that fall on a particular rod or cone).
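
    A minimal sketch of what ordered-grid supersampling boils down to - render at double resolution, then box-filter down. The flat buffer layout and single grayscale channel are simplifications of mine, not how any particular card does it:

        /* Average each 2x2 block of the oversized render down to one
           output pixel. One 8-bit channel shown; real hardware does
           this per color component. */
        #define W 640
        #define H 480

        void downsample_2x2(const unsigned char *src, /* (2W) x (2H) samples */
                            unsigned char *dst)       /*   W  x  H   pixels  */
        {
            int x, y;
            for (y = 0; y < H; y++) {
                for (x = 0; x < W; x++) {
                    int sum = src[(2*y)     * (2*W) + 2*x]
                            + src[(2*y)     * (2*W) + 2*x + 1]
                            + src[(2*y + 1) * (2*W) + 2*x]
                            + src[(2*y + 1) * (2*W) + 2*x + 1];
                    dst[y * W + x] = (unsigned char)(sum / 4);
                }
            }
        }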

    We don't have the retinal resolution to perceive the world around us at the atomic (let alone the Planck) level, so we see those nasty jagged edges as perfectly smooth. Once pixel-based displays get close to the limits of our eyes (1200dpi looks pretty good), we won't be able to tell the difference.

    Not to say there aren't more efficient ways of doing this, but pixel displays will work perfectly well.

  • Hey, ATi produces great graphics cards. The software, etc., for them just isn't up to standard.

    I mean, the performance of all ATi Rage cards would be WAY better if they focused on producing excellent drivers for their excellent cards.

    Or, if their new chip actually does 2 gigapixels, they wouldn't have to make useless stuff like the MAXX, which doesn't even match up to the GeForce 256.

    AW

  • I find the current obsession with 3d graphics and 3d sound rather odd.

    I myself am a casual gamer and played my share of Half-Life, Descent and Freespace (try Freespace, it's good).

    Now if I look at any current game-capable home PC, I think it's quite a /bastard/ of a computer.

    - A graphics card that generates a LOT of heat, so much that it needs its own fan.

    Some graphics cards consume so much power that the mainboard's chipset starts behaving flaky. Some consume more power than not-that-old chipsets support, and can indeed cause actual hardware defects on a mainboard.

    - My desktop's graphics card has 32 MB of RAM. Come on, my laptop has 32 MB of system RAM and I do close to everything on it, including word processing, database development, web server stuff...

    - Yet the graphics card's awesome processing power and its RAM are used for gaming only. I mean, most of my time at a computer is spent programming or using office applications. At those times, these resources are just idly wasting power.

    - What do these modern graphics cards do? They speed up 3D related calculations.

    - Isn't that more or less a specialized variant of what a digital signal processor does? Or am I naive when it comes to DSPs?

    - Why do we have to have specialized chips for graphics, sound, win-modems and whatever, when we could have used a single type of add-on chip to help the CPU? Why has the industry still not decided to put at least *one* versatile, programmable DSP on every modern computer mainboard? Those things are cheap, they are versatile, and they could be programmed to speed up a whole bag of different algorithms: audio (think live MP3 encoding - that was possible years ago with a DSP), graphics (Photoshop filters), encryption. (See the sketch at the end of this post for the kind of loop a DSP is built for.) Again: or am I naive when it comes to a DSP's capabilities?

    - Another related question: why the odd decision to push 4-way sound cards for 3D sound? 5-channel home movie theatre stereos have existed for a long time; why did no one in the industry decide to offer a simple card to connect to those?


    Oh my. I don't claim to be an expert on DSPs or 3D audio. But still, from the little I know, I think the last two years in PC hardware design went terribly wrong...
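
    To make the DSP point concrete, here is the kind of inner loop DSPs are built around - a FIR filter, i.e. a sliding multiply-accumulate. The function and names are my own illustration; real DSPs have single-cycle MAC units and circular addressing precisely for loops like this:

        /* y[i] = sum over k of h[k] * x[i-k]: one multiply-accumulate
           per tap, per sample. MP3 encoding, Photoshop-style filters
           and the like all reduce to loops of this shape. */
        void fir(const float *x, int n,     /* input samples  */
                 const float *h, int taps,  /* filter kernel  */
                 float *y)                  /* output samples */
        {
            int i, k;
            for (i = taps - 1; i < n; i++) {
                float acc = 0.0f;
                for (k = 0; k < taps; k++)
                    acc += h[k] * x[i - k];  /* the MAC step */
                y[i] = acc;
            }
        }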

    ------------------
  • To answer myself: Yes, I know that some time ago, IBM manufactured a generic DSP card that was used as a modem and a soundcard, and yes, I know that it blew.

    But I think the concept is right; IBM just failed to execute it properly.

    ------------------
  • Will I have to connect your company's graphics card to a cup of tea? Now that would be a change.

    ------------------
  • I bought this card because I wanted to play Quake 3. It was $50. Come on now, who's stupid: someone who pays $50 for a card he'll use a couple of times, or some guy who pays $300 for a card he'll use a couple of times?
    You are stupid.
    Very stupid.
    Don't you hate eating your own words?


    If you think you know what the hell is really going on you're probably full of shit.
  • "more and more fighting games"? I'm assuming that you mean 3D fighting games in the arcades, since the last 3D fighter for a PC platform that I can remember well was FX Fighter, which almost nobody cared about. As far as the rise of 3D in fighters; give me a good game of King of Fighters or Last Blade anyday.

    "I long for the days of nes, snes, and genesis, where game programmers had to come up with ORIGINAL ideas. "
    And frequently didn't. Can we say Mortal Kombat and [insert SNES Street Fighter clone here]?

    "Now it's almost always shoot-em-ups!"
    Please don't confuse 3D first-person-shooters with the traditional shoot-em-ups; as defined at Shmups [classicgaming.com]. ;)

  • It is basically a gutted PC. Wow. It is just a way to build hype. And the more they don't say about it, the more people will just get annoyed and go away.

    Like Nintendo with their Project Reality, Ultra 64, etc. Even Nintendo now discloses more than MS. Nintendo has to be the most secretive and strange company; they do some of the strangest things, but they can pump out the games. They can make a good game themselves, and find companies that can make fun games for them.

    Can Microsoft do that? Unless they basically try buying their way in, they will have more of a problem than they expect. Nintendo and Sony have deep pockets. And Microsoft has no series characters like Pokemon, Mario, or Crash. And no Square.

    And at least Nintendo has disclosed general info, like the IBM copper CPU, etc.

    I don't really care too much. Good games make people buy consoles, not the processor or the damn graphics card.
  • by heroine ( 1220 ) on Thursday November 18, 1999 @05:24PM (#1521066) Homepage
    We're entering a time when the only way to program 3D cards is manipulating the registers directly. There's no way you can get these cards supported by a library in time for the next generation of cards. We're leaving the age of abstraction, and it's becoming more important for software to access the graphics hardware directly.
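
    For anyone who hasn't done it, "manipulating the registers directly" means memory-mapped I/O along these lines. Every register name and offset below is hypothetical - the real ones come out of each vendor's register spec, which is exactly the documentation problem:

        /* Hypothetical 2D blit via memory-mapped registers. Offsets
           are invented for illustration; a real chip's spec defines
           them (and is often under NDA). */
        #define FIFO_FREE 0x0004  /* free slots in the command FIFO */
        #define BLT_SRC   0x0100  /* blit source offset */
        #define BLT_DST   0x0104  /* blit dest offset   */
        #define BLT_GO    0x0108  /* write 1 to start   */

        static volatile unsigned long *mmio; /* card's mapped register aperture */

        void blit(unsigned long src, unsigned long dst)
        {
            while ((mmio[FIFO_FREE / 4] & 0xff) < 3)
                ;                         /* busy-wait for FIFO room */
            mmio[BLT_SRC / 4] = src;
            mmio[BLT_DST / 4] = dst;
            mmio[BLT_GO  / 4] = 1;        /* kick off the blit */
        }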
  • I think that's a bit of a temporary situation, really. Right now, almost all the first-person shooters are competing for 'wow factor' in the area of visuals. Some of them do include some cool features and all, but all in all they are the same game with beefed-up graphics. However, there are companies that currently produce games that make use of 3D hardware that aren't just the same FPS over and over. Dungeon Keeper 1 & 2 are a good example. So is the upcoming 'God Sim' Black & White (something I am REALLY waiting for!). There are new flight sims making good use of the hardware, Bullfrog's expansion of the Theme Park series, RTS games, etc., etc., etc. I also think you should look back in the past a little more regarding those 'original' ideas - for each original game there were a dozen that came right after it that tried to do the same game, but better. Looking back even further, look in the arcades at things like Galaga, Galaxians, and the tons of other games that were just like them. For now there isn't a whole lot of innovation, but there is some. But as the FPS market gets more and more glutted, and becomes less viable for new developers looking for a piece of the action, 3D hardware will get pushed in other directions instead. It's a cycle that happens all the time, and this particular one is not very different from the rest of the game cycles that have occurred before it.
  • This is supposed to say this:
    Personally, I like the expanded colour range and on the fly hardware particle generation.
    All you need to do is press down on the sockets to reseat the firmware for a minute or two.
    Days of fun.
    Thanks for watching. I'll spell better next time.
    honest.

    Pope
  • What, is your graphics card not running hot enough already that you need to pour near-boiling water on it?
    Or are you looking to keep your tea warm?
    Either would work.
    Does wonders for those real-time particle generation experiments.
    But it just makes the QuakeBots jumpy.

    Pope
  • can you say EVERQUEST, massively multiplayer adventure game

    can you say Tribes!

    can you say Thief!

    can you say Half Life!

    there's more than just Quake 3 these days

    :)
  • That's the way it's always been. How do you think Pokemon became so popular on a 1 MHz processor with 8K of RAM, while Dreamcast games like VF3tb flopped with a 200 MHz processor and 26 MB of RAM?
  • You don't seem to understand the difference between games and professional work. Tell me ONE professional 3D program that runs on Linux (except Houdini). The O2 came out in 1996/1997, and you are comparing a 200MHz O2 made for _modelling_ with a brand-new 500MHz (or whatever you've got) P3 with a brand-new graphics card made for games. That's like comparing apples and oranges!

    And the fact that DD rendered Titanic on Linux doesn't say shit, because rendering is all about CPU number-crunching and you only need an OS that doesn't crash every second and doesn't cost an arm and a leg - so they could have used *BSD instead of Linux.
  • Okay...dumb question...

    I can understand the V5's need for an external supply since the bus alone can't provide that much power. But...why an external brick? Why not an ordinary drive-style power connector internally that can tap the system's existing power supply?

    Gay! Totally gay! Liberace gay!
  • by Egorn ( 82375 )
    I had the pleasure of ordering an ATI Rage Fury and an Nvidia TNT 1 at the same time and running them both on the same computer to test them. Originally I was going to put the ATI in my preferred computer, but I was disappointed to discover that, for all the extra features and twice the memory the ATI Rage Fury had, either the drivers were not complete or the hardware was not as good as the TNT1's.

    I know what you might say: "Maybe your one card was bad..." I thought of that; I ordered another one and had the same results.

    The other thing was that for the first few months it was the second closest thing to vapourware after Daikatana. The only people who could get hold of one (after the release date) were people who review hardware.

    Will the "New" ATi Rage Fury MAXX be the same story? If so then the niVidia GeForce will still be my First choice.

    Conclusion: wait till they both have been released. If you can get hold of one for free, test it and post your results for others to see. And for those who can't... check online before buying either.
  • The OpenGL extension specification deals quite nicely with this problem. Hardware vendors can add new functionality through extensions without waiting for an official update to the library. (Non-extensible libraries like Direct3D will find themselves at a disadvantage.)
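
    As a sketch of how that works in practice (glGetString(GL_EXTENSIONS) is standard OpenGL; the extension tested for is just an example):

        /* Check the space-separated extension string for an exact
           token match, so "GL_EXT_texture" doesn't falsely match
           "GL_EXT_texture3D". */
        #include <GL/gl.h>
        #include <string.h>

        int has_extension(const char *name)
        {
            const char *ext = (const char *) glGetString(GL_EXTENSIONS);
            const char *p;
            size_t len = strlen(name);

            if (!ext)
                return 0;
            p = ext;
            while ((p = strstr(p, name)) != NULL) {
                if ((p == ext || p[-1] == ' ') &&
                    (p[len] == ' ' || p[len] == '\0'))
                    return 1;
                p += len;
            }
            return 0;
        }

        /* e.g.: if (has_extension("GL_ARB_multitexture")) ... */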

    As 3D chipsets multiply, software abstraction becomes increasingly important. Application developers can't be expected to keep up with the onslaught of new hardware. If manufacturers want developers to take advantage of their new hardware, they have no choice but to support standard libraries.

    I certainly wouldn't go out of my way to support a particular chipset if the manufacturer didn't bother to provide proper OpenGL support...
  • Microsoft DO *DO COMPUTERS*.

    Viglen, a British computer company that Amstrad took over, has the rights to splash the Microsoft logo over their machines...

  • Any information about the ancillary buffers available on the ATI card? I've been working with a TNT2 card, and the results are pretty poor if you need stenciling or accumulation using the 16 bpp X server provided by NVIDIA. I'm working with robotics simulation and I would be better served by an SGI box - but most people (including the lab managers, etc.) can't see the difference between a high-end PC with a 3D card designed for games and a low-end SGI workstation at a slightly higher price. And the 3D hacks in xscreensaver don't work in full-screen mode with my card (in fact they work, but in single-buffer mode). I've been offered an Ultra 1 box with a Creator card, which is slower than an average PC (OK, I guess an Elite 3D card would help me there). I'm now waiting for that workstation-class NVIDIA card, or perhaps this ATI card if it has the extra buffers.
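
    In case it helps anyone in the same spot: you can at least ask GLX up front for the ancillary buffers and fail early if the server can't provide them. glXChooseVisual is the standard call; the specific bit depths below are just examples:

        /* Request stencil and accumulation buffers explicitly; a NULL
           return means no visual with those buffers is available. */
        #include <GL/glx.h>

        XVisualInfo *pick_visual(Display *dpy)
        {
            int attribs[] = {
                GLX_RGBA,
                GLX_DOUBLEBUFFER,
                GLX_DEPTH_SIZE,       16,
                GLX_STENCIL_SIZE,      8,
                GLX_ACCUM_RED_SIZE,   16,
                GLX_ACCUM_GREEN_SIZE, 16,
                GLX_ACCUM_BLUE_SIZE,  16,
                None
            };
            return glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
        }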
  • If you RTFA (Read The Fine (!?) Article) :) at www.3dfx.com/products/voodoo/newvoodoo.html, it mentions (in the interview, I think) that the Voodoo5 6000 needs a brick because 3dfx feels it'd stress a normal PC PSU too much. Lame, I know, but that's what they're saying. The V5 5000 and 5500 will have internal PC PSU connectors, apparently.

    Hope that clears it up.

    --
  • arse.. I should check my own URLs
    here [3dfx.com] is the correct link
    --
  • That depends on what Transmeta have in mind.

    It's entirely possible to stack multiple processors onto one die.

    (That's how the 486 and Pentium work, merging the 386 and 387 processors into a single unit. The CyrixGX went one step further, merging the graphics processor in as well.)

    If Transmeta have a "universal instruction set" general CPU, merged with ultra-fast GPU and MPU, it would rip to shreds everything else out there.

  • Funny, I thought the only "fullscreen only" chips around in the home-user arena were 3dfx's...

    How come it won't do 3d in a window?
    My Permedia2 can, but only up to 1024x768x16bit.
    Have you tried to lower the resolution?

  • Don't forget synth applications!
    Most modern "analog modeling" synths use one or more DSPs as their sound engines.
    The Clavia Modular series, the Yamaha AN1x, the Virus, etc., etc.

  • I'd like to see you render a scene with, say, 500,000+ polygons on a TNT2. =)
    You'll sink your puny little Pentium, since the card does absolutely *nothing* to help you. To get 85 fps with a low poly count, all you need is a high pixel fillrate and a texture mapper.
    High-end cards aren't made for gaming, so they put lots and lots of computational power into them instead of giving them a totally pointless high fillrate.

  • Your description of graphics processors, including "XORs, fills, tight loops oriented towards shading" describes the bottlenecks in 2D graphics chips well, but does not describe the problems one would need to optimize in a 3D graphics chip. The 3D bottlenecks are not amenable to a Transmeta-style solution, IMHO.

    I've done some research on the 3D graphics chips of UNIX workstations and PCs over the last five years, and Transmeta's approach doesn't make any sense as a peak-performance graphics chip. While I must apologize to any real 3D chip designers out there for my generalizations and any misconceptions they may spot with their even greater experience, I'll try to summarize why programmable chips don't make sense for accomplishing fast 3D:

    1) If the algorithm a chip must execute is fixed (as it generally is with 3D algorithms), nothing is faster than a well-designed hard-coded ASIC that lays down precisely the circuits needed for optimally accelerating that algorithm.
    2) If the algorithm varies substantially, a general purpose processor is more useful. In cases where there is significant unpredictable branching in the algorithm, a general purpose CPU will be optimal. In cases where there is strong data parallelism, a DSP will be optimal.
    3) Run-time programmable logic, such as FPGAs (field programmable gate arrays), microcode, or Transmeta-style programmable logic, has traditionally been best for cases where
    3A) you want to trade off top-notch ASIC speed for programmability in case your algorithm isn't debugged or you get flaws in the silicon
    3B) you wanted to accelerate a broad, flexible set of functions faster than, say, software, but didn't want the expense of a general purpose microprocessor
    3C) you really want to accelerate one algorithm now, but in a few minutes you want to accelerate another. With 3D, you're changing the path taken through the 3D pipeline multiple times per frame, at every state change. You wouldn't be able to reconfigure your chip fast enough to keep up with that kind of change; at best, you might reconfigure your logic for a particular 3D game and the 3D primitives it uses.

    Transmeta and FPGA chips can of course accelerate 3D logic, but what you have to realize about 3D logic is that
    1) it is very branchy - lots of special cases depending on just whatever mode you're in (this makes DSPs a poor fit for rasterization, although Intergraph has used DSPs for geometry acceleration)
    2) the front end geometry processing is primarily floating point (vertex) matrix manipulation, the back-end rasterization is primarily integer (pixel) manipulation; your architecture must provide both types of execution units, in the right proportion
    3) the process of shipping all the various vertex, texture and triangle-to-pixel data is very timing-sensitive, requiring lots of dedicated FIFO buffers for optimal acceleration
    4) many pixel operations, such as pixel walking or gouraud shading, generally require very simple increment operations that don't need a full integer unit as one would find in a CPU or DSP (FPGAs would be better here, ASICs better still - see the span loop sketched after this list)
    5) the data paths between circuits on the chip grow practically exponentially as you go through the 3D graphics pipeline. Megabytes of vertex data turn into gigabytes of pixels. A general-purpose CPU or FPGA is not typically optimized for this.
    6) more so than FPGAs or CPUs, a 3D chip has to be optimized for huge output bandwidth to the frame buffer, both read bandwidth (for Z-buffer and blending operations) and write bandwidth, with a separate set of data lines for reading in the initial vertex/texture primitives. The backend frame buffer bandwidth typically requires more pins than you'd have in a CPU/FPGA package. And most CPUs, DSPs, and FPGAs don't have such split memory controller setups integrated into the package, requiring a more expensive external chip.
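
    To illustrate point 4, here is a gouraud span loop sketched in C (the names and the 16.16 fixed-point format are my own illustration, not any particular chip's):

        /* Gouraud-shaded horizontal span: intensity starts at c and
           steps by a precomputed per-pixel delta. The inner loop is
           nothing but adds - no multiplies, no branches. */
        void draw_span(unsigned long *fb, int x0, int x1,
                       long c, long dc)  /* 16.16 fixed-point */
        {
            int x;
            for (x = x0; x < x1; x++) {
                fb[x] = (unsigned long)(c >> 16); /* integer part */
                c += dc;                          /* one add per pixel */
            }
        }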

    If this got too technical, I apologize for not having time to make it simpler and/or more precise. But nothing I've seen about Transmeta rings any bells as having promise for making a faster 3D graphics chip - something I'd be very interested in.

    --LP
  • Putting multiple processors onto one die is certainly an interesting idea; that doesn't directly address the "application-logic" versus "graphics-logic" issue, as the latter is evocative of different kinds of ALU manipulations.

    But that word "address" is the critical thing... In order to stack many processors together, and make use of them, you need considerable memory management hardware so that those CPUs can actually address memory, and not trample on one another whilst doing so. Parallelism isn't trivial to harness...

  • If you're that upset with the company, I would urge you to never, under any circumstances, buy a product from them again.

    That's why I have a 19.2 modem instead of a cable modem.

    Companies only understand money.

  • That's certainly a fuller explanation of things than my knowledge can cover...
  • Get the TNT/TNT2 drivers from the glx CVS server, or snapshots. I guess you could find the website by searching for "glx", "mesa", and "tnt" on Google.

    Xscreensaver will work double-buffered (which means it won't flicker, if anybody else has that problem), but expect bugs in other GL apps. It will also be a lot less stable - an unstable X can and will crash the entire system. And to top it off, no matter how hard you try to optimize it with -mpentiumpro CFLAGS and such, the CVS driver will be slower than the binary released by nvidia.
  • Too bad it'll never get moderated up at this point, due to the day-late nature of my post. One of the major flaws with the slashdot approach, IMHO.

    The only way I can think of to solve it would be to force moderators to read a certain set of posts that haven't been seen by other moderators (somewhat like the meta-moderation process.) I admit this would add an additional small discouragement factor to moderators.

    --LP
  • Shit, somebody remembered FX Fighter. I have that game somewhere in my closet or something.

    Anyway, you forgot Virtua Fighter 1 + 2 for Windows 95. They are/were the definitive 3D fighting games. I wish they had more for the PC. Tekken 3 would sure as hell be nice. VF3 would probably run in some form on a PC. And while someone's at it, make a Linux port.
  • While I am personally in favor of giving people like Mr. Kaczynski here internet access so they can continue to add to the social discourse--and tickle our funnybones. Nevertheless, I feel a good majority of the nation's taxpayers would be mighty pissed to find out he is not only on "the net" but posting on slashdot. cph
  • ...funnybones, nevertheless...
  • I'm not sure posts get moderated down just because they say they're going to be - in most cases.
    I think people SAY that in their posts because they feel like they're going out on a limb, or have low self-esteem or something like that; and in reality they ARE going out on a limb, and it's limbworthy thought that earns good moderation points.

    That's the optimist's view, though.

    I wish I had a nickel for every time someone said "Information wants to be free".
  • Digital signal processing takes an analog audio or video signal, converts it to digital format (duh), and then takes advantage of its digital state to modify the signal: e.g. filtering out noise, or enhancing a video stream. I'm willing to bet that video cards with some sort of video input or a TV tuner on board have DSP chips to process the information.

    With 3D audio, there is a sound card that will output sound to a Dolby Pro-Logic Surround Sound capable receiver: Creative Labs' SoundBlaster Live! series, including the original card, the value edition, the MP3 edition, the platinum edition, etc. The first main software upgrade to the sound card (LiveWare 2.0) included the ability to encode audio in Dolby Surround format, so you can plug it right into your favorite home-theater stereo and blast yourself away. Believe me, it's fun. =)
  • The difference between a low end SGI and a high end PC is enormous. If you got a refurbished O2 with an R5200 you'd beat the pants off any PC box for the same price. With a relatively inexpensive upgrade you can get an R12000 instead. OpenGL support would of course be native and the graphics card would easily be comparable to a TNT2.
  • Umm .. I don't get it.
