Graphics Software

3dfx' Voodoo5 6000 Still Alive

mr.blobby writes "3dfx' long-awaited "big-daddy" version of the Voodoo5, the Voodoo5 6000, has been delayed almost as long as Daikatana, but according to this news story, the card (with all of its four TMUs - texture memory units) and its external power supply have been sighted at a gaming trade show in London (ECTS), and it is still slated for release. There are a few benchmarks showing it beating NVIDIA's GeForce 2, which can't be bad. The author said that "the card was hitting around 50-60 FPS at 1600x1400", which seems most impressive."
This discussion has been archived. No new comments can be posted.

  • Although I'm no expert on gaming (the last one I played was Quake 1, then Dopewars a few years ago).. pushing for maximum FPS *can* be a good thing, even if it is beyond a person's visual perception. I'm making this guess based on how the FPS checks are run... in a quiet battlefield of Quake 3 at 1024x768, brand "X" card can do Y frames/sec... now throw in three raging bots, two head hunters and all sorts of whizz-bang visual effects, and maybe that rate Y drops to Y/2..? (Or maybe CPU performance affects this much more than I think? Somebody care to comment on that?)
  • Man, you seem to be young.
    Every current vidcard has at least 20-fold the memory of my first computers (ZX81, C64)...
  • There is already a much better [ultimatechaos.com] 3D video card out there... Go and check it! ;-)
  • Remember that VooDoo2 [quantum3d.com] SLI (2 VooDoo2's on one card with 24MB RAM)? It cost around $550 or $600 depending on where you got it. People still snapped those up. And that was when video cards/games/extreme hardware weren't as popular as they are today. So this should be interesting indeed...
    As for losing money, that may be true. However, 3dfx is still number one in sales according to this [yahoo.com] article.

  • What's the point of 50-60 FPS when broadcast video is ~30? Are we itching for a higher frame rate in movies and TV?

    Actually, yes -- we are itching for a higher frame rate in movies and TV. Thirty frames per second really isn't fast enough to have objects move quickly across the screen without appearing jerky. It limits the speed you can pan across a scene, and it limits the speed at which you can move the important objects across the screen.

    You'll probably never notice the jerkiness caused by fast side-to-side movement when you watch any professionally made movie. Why? Because cinematographers are very careful to avoid fast side-to-side movement. Since you'll never see such shots, you might not have ever even noticed that they're missing!

    The same thing also plays heavily into the limits of a first person shooter. How fast you can spin, how quickly other objects can move across your field of view, even the width of your field of view, are all carefully controlled by the game designers. You can only spin around so fast, before the walls and other players start to jump around instead of slide around, and the limits are controlled primarily by your frame rate.

    And, of course, frame rates faster than your video display hardware can show give the designer all sorts of interesting tricks -- anti-aliasing, motion blur, all types of interesting effects. And some of those effects (like motion blur) can make up for otherwise slow display hardware. So, a video card that can render a few hundred frames per second, but only display 70-80 (depending on the monitor), can still give game designers more leeway in game design -- there are a number of interesting tricks they can do before the video starts to appear choppy.
  • >My current computer

    or your current video card? Your video card has 160,000 Ko?

    My first computer was a TRS-80 with 4K. I wouldn't use that in this comparison, though. I would only go as far back as the 386 with 4MB that was my first real PC after graduating college. I thought it was pretty ridiculous to buy a Voodoo2 a couple of years back with 16MB, but here I got a Matrox G400 this year with 32MB. The damn video card has 8x the memory of my first system. I think the Mach8 in that system had a total of 256K of memory.
  • This isn't a limitation in the methods used to create movies or special effects, whether the method used was claymation, models, live sets, computer generation, or animation. Naturally, when you move a model through space during filming there isn't going to be any stutter.

    What I'm talking about is a limitation in the medium itself, which in this case is film. Film runs through a projector at 24 frames per second. So, in theory, if you move an X-Wing model across the camera's field of vision at a rate of 24 inches per second, the model will have moved a full 1 inch per frame.

    If you slow the model down to 6 inches per second, the motion will be very smooth and the human eye will not detect the fact that it is stop motion. Now take that model and push it towards the camera at a rate of 48 inches a second. There will be huge 2 inch jumps between frames, and those 2 inch changes become more and more obvious the closer the model gets to the camera, since it is moving towards it. This is the same effect as driving past a mountain that is 50 miles away: it appears to be standing still while the telephone poles are whizzing by.
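    To put numbers on that, a quick sketch of the arithmetic (assuming idealized stop motion with no motion blur, at film's 24 frames per second):

        # Per-frame displacement at film's 24 fps (assumption: ideal stop
        # motion, no motion blur to smear the jumps between frames).
        FILM_FPS = 24

        def inches_per_frame(speed_inches_per_sec):
            return speed_inches_per_sec / FILM_FPS

        print(inches_per_frame(24))  # 1.0  -> the X-Wing jumps a full inch each frame
        print(inches_per_frame(6))   # 0.25 -> small enough to read as smooth motion
        print(inches_per_frame(48))  # 2.0  -> big, visibly stuttery jumps up close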

    For our final demonstration, take out one of your Star Wars tapes, if you have a VCR that can step through frame by frame it would help. Find a spot where a ship, energy beam, or anything flies close to the camera at a fast pace. Pause the movie, and step through one frame at a time. Notice the effect?

    Now, take this same principle and apply it to gaming. If you turn your field of vision 180 degrees, you are going to be moving a lot of pixels around quickly. If you have only 30 FPS, you are only going to be capturing a set of pixels for every virtual 'foot', or whatever measurement is accurate, that you traverse. This means an opponent could be lurking in a shadow, and because your FPS is too slow, you missed the frame where he is visible. If, on the other hand, you have a card that can push 60-80 FPS, the motion will be very clean, faster than the eye can see, and you will be able to pick out every single 'inch' of territory mid-spin.

    This whole topic really isn't technology specific, so where the stuttering originates is not relevant. The point is, dropped frames cause you to lose information.

    By the way, the jerkiness caught on film is not easy to detect. You practically have to be looking for it. For all practical purposes it isn't detectable. This is why the film industry really hasn't made strides to convert from 24 frames per second to something higher. There are only a few situations where it is really noticeable. For the most part, your eye is fooled.

    So if 24 fps is good for the film industry, why not the gaming industry? Like I have said before, the types of movement going on in a game are practically ALL the types of movement that show up as stuttery on film: objects moving quickly towards you, etc., be they rockets or opponents. They generally do not film movies the same way a gamer plays a first person shooter; the audience would get sick if they did. :)

  • Like I said, I'm not sure. I know that I have heard the 60 number before. 30 sounds pretty low to me; otherwise you wouldn't be able to detect the 24 fps film discrepancies I was talking about in the above thread. The threshold would be too thin. Seeing rotors and wheels doing that weird effect wouldn't necessarily be proof of a 30 fps limit, as they are rotating at a vastly higher rate. It could just as easily be an effect seen at anything over 60 fps. That is just an optical illusion based on something moving at a much faster rate than the fps limit. If we saw things at 400 fps, we might still see the same effect. I wouldn't know, though. I'm not sure of the equations for figuring that out.
  • It's not impressive on its own, but it's impressive that their chip scales. If they can miniaturize it a little and reduce power/heat, then who cares if your board has 16 processors on it?
  • by Anonymous Coward
    As far as movement goes, the human eye certainly can't perceive 60. About 40 probably maxes it out. You can still see improvement when you go from movies (24 fps) to TV (30 fps in the US), but it's so small that anything faster doesn't show up in the charts.
    The reason you need high framerates is because today's screens have almost nil afterglow in their phosphors. It used to be that a screen was stable at 60Hz. If you run a hi-tech screen at 60Hz it flashes the brains out of your eyes. So you need something like 80Hz (my minimal refresh rate to work at) or above for the image to become stable.
    So, basically, if you want perfect smoothness you need to find a card that bottoms out at about 35 to 40 fps in each game you intend to play. And right now they all are around that point.
    I'd like to challenge gamers to recognise the difference between a card doing 40 fps and one doing 60 fps. The placebo effect is very strong, and until you do a blind test you just don't know if your whizz-bang ultra-gamer's card is really so much better than a regular run-of-the-mill standard card.
    You also see cardmakers realising this, because they're starting to push extra features (T&L, FSAA, ...) instead of pure fps.
    This of course doesn't mean that you can buy a card now and run games on it at 40 fps for the rest of your life. There is such a thing as polygon count, and it matters. But I think this whole fps obsession gamers have is just like homeopathic medicine. It only works if you believe it.
  • The 'backwards wheel' effect is because of the difference between rotation rate and perceived framerate. If a wheel rotates 30 times per second, and you see it at 30fps, it will seem to be standing still (ignoring motion blur). Think on that a second... each frame is captured with the wheel in exactly the same position as the previous frame... no apparent motion. If the wheel rotates at slightly less than the framerate, then it will appear to rotate backwards. The apparent rate can actually be computed as (wheel rotation freq.) mod (framerate), as sketched below.

    In practice, as opposed to theory, motion blur and motion relative to other objects also strongly influence perceived rotation.
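    One way to make that mod formula concrete (a minimal sketch, assuming no motion blur; the apparent rate folds into the band the framerate can represent, with negative meaning backwards):

        # Apparent rotation rate as sampled at a given framerate (assumption:
        # no motion blur; rates alias into [-fps/2, +fps/2)).
        def apparent_rotation(wheel_hz, fps):
            return (wheel_hz + fps / 2) % fps - fps / 2

        print(apparent_rotation(30, 30))  # 0.0  -> wheel appears frozen
        print(apparent_rotation(28, 30))  # -2.0 -> appears to creep backwards
        print(apparent_rotation(2, 30))   # 2.0  -> slow forward, as expected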
  • Linux is so marginal that it's hard to express. Name one company with an architecture that's been out for less than a year that has "dedicated" support for their Linux user(s).
  • Judging by the size of that card I have serious doubts about whether it would even fit in my case, given the layout.

    WOW!

  • Ok, how many people have monitors that can do 1600x1400?

    It will require some serious cash to get a monitor that can take advantage of the newest video cards.

    That and the roughly $600 for the card; who can afford all this anyway?

    I got the GeForce DDR for $300 and I thought that was insanely expensive. No way I am spending ($600 + ($500 to $1000)) just to play at resolutions that don't look THAT much better than 1024x768.
  • Framerate is not everything - in fact, I prefer picture quality over speed (of course only to some degree, but that's my preference - guess why I have a Matrox G400).
    It does not impress me to see 70 frames of crap in a second - not even when it is in a resolution that exceeds the capabilities of my monitor.
  • Your eyes can see anything and everything; however, the brain can only interpret ~24 FPS.
  • Thing is, the cards cost a lot more to manufacture. With the new GeForce 256 cards there is 256MB of DDR RAM. That's expensive, and with the Voodoos they have to cram 4 chips onto a massive big-bertha card with an external power supply just to *compete* with a GeForce.. makes me laugh.. and I'm sure it makes nVidia laugh all the way to the bank.. hahaha... 3dfx needs a new chip.

    JOhn
  • Read the article.
  • They chose to have an external power supply so there would be less draw from the power lines on the AGP card. Also, if the fans are powered by a drive power connector, there's less magnetic distortion near the card, supposedly making for better reliability, but personally, I think it's overkill (except for the 4 fans; it's good that those are powered off an external power line). Still, the V5 6000 will be slower than a GeForce 2 GTS; I think that they pulled that 1600x1400 figure out of their @$$; that res isn't even 4x3 or 16x9, it's 8x7, too close to square.
  • by Black Perl ( 12686 ) on Wednesday September 06, 2000 @01:41AM (#802105)
    It won't be long before the CPU is a card in a PCI slot on your ATX videoboard.
  • The SharkyExtreme guy just guessed it. The FPS was probably 20FPS while the FSAA engine blended the frames together, or something similarly screwy like 3dfx likes to do.
  • Does anyone know the motivation for the external power supply? Will I have to plug in DC from the wall, or just a hard-drive type plug from the existing power supply? What's next, video cards getting their own case?
    -stax
    /. poster #104543567
  • The problem is that it's not miniaturized. Have you seen pictures of the card? It's a huge hulking beast that requires its own power supply and has problems fitting in the AGP slots of many smaller-sized cases.

    Honestly, 3dfx seems to be living in an alternate universe where people still care about them being the first affordable consumer 3D card company... They really need to bring something stunning to the market next cycle or NVidia (and even ATI now) are going to stomp them so far into the ground...

  • An earlier post said that no one really cares about 3D performance, esp. at high res. I know I am not the only one, but I work and play at high res, usually the highest available. (Currently I am using a VoodooIII 3000 AGP at 1600x1200x32 @ 75.) I love graphics, and I really like them when they are crystal clear and defined. My current Voodoo runs almost every game/program at high res like a champ. No lost frames, high speed, high detail, and incredible performance. I'd buy the V5 6K if it has better performance. Already tried a GeForce2, and didn't see much of a difference at high res. It's a good piece of technology, but it doesn't have the performance where it counts, to me anyway. It was better at lower res tho, I will admit. I say "right on" to 3dfx, tho the price be way too high. External PS? Who cares! There are already three power strips that go there!
  • Okay, this silly one's taken from here [ultimatechaos.com], but I think that it would apply to the V5 6000 as well.

    DISCLAIMER: Will not fit in most PC cases without slight case modifications. Card comes with a Honda diesel generator for power. Not FCC approved. Keep away from water, small children, air traffic control towers, munitions dumps, and pacemakers. Extended use may cause cancer. Shipped via truck freight. Estimated retail price $5,743.56 (USD)
  • What's the point of 50-60 FPS when broadcast video is ~30? Are we itching for a higher frame rate in movies and TV?
  • Not likely to happen :-) But it's really weird that, for example, some current video cards have more memory on board than the computers I bought some time (ok, years) ago. Jeez.
  • I believe the reason 3DFX gave for the external power supply was as follows:

    After testing many motherboards from multiple manufacturers 3DFX determined that the voltage supplied to the video cards was inconsistent. Now, due to the requirement for a stable voltage that falls within a very narrow range, 3DFX opted to go with an external power supply so the card always received the juice it needed.

    Personally, I think this card will be a complete waste of money. I used to be 3DFX all the way......not any more....

    Just buy a GeForce. It's simply a superior card in all respects.

  • Doesn't sound like you play 3D hardware-accelerated games, so why do you care?
  • by Enoch Root ( 57473 ) on Wednesday September 06, 2000 @01:47AM (#802115)
    ...vaporware?

    I mean, even the movie Battlefield Earth had good reviews before it came out. Never, ever trust benchmarks made before people were able to pick up the card off the shelves.

    Besides, the very shape of this story (spotted at a tradeshow somewhere) has the smell of urban legend if you ask me...

  • Too late :)
    Check this URL; you'll love this site.
    http://www.evertech.com/accelerapci/
    I know this is WAAAAY off-topic, but what the hell, karma won't kill me.
  • You have to plug it into the wall. It's a 100-watt power supply; apparently the card uses so much power it would be too much for a standard ATX supply to handle. Furthermore, with their own (high-quality?) external power supply they can guarantee a good clean power 'signal'.
    ---
  • by grahamsz ( 150076 ) on Wednesday September 06, 2000 @02:31AM (#802118) Homepage Journal
    Is anyone else concerned about just how much heat this will kick out into your system?

    A week or two back I put a Voodoo5 5500 into my system along with a second 7200rpm drive, and now, despite having extra fans, with those two along with my Celeron 300@464MHz I need to run with the case off.

    Now I'm not much of a gamer - the V5500 was about the only card around the £140 (UK pounds) price tag that had decent Win2k drivers - but if it takes about an hour of normal Windows usage for my motherboard temp to hit 50C (120-something F), surely anyone playing games on it would toast it.

    Now imagine twice the GPUs and twice the heat...

    I think soon we'll find graphics subsystems coming in a separate box, and at this rate it'll soon be bigger than your PC and require its own 3-phase power feed from a dedicated nukelea-r generator (homer's running mine :).
  • with all of its four TMUs - texture memory units

    The Voodoo4/5 don't have TMUs anymore. That was part of the old Voodoo architecture (VD1/2/3). The VD4/5 have VSA-100 (Voodoo Scalable Architecture).

  • by justin_saunders ( 99661 ) on Wednesday September 06, 2000 @02:33AM (#802120) Homepage
    Judging from the time demo that was shown I'd guess that the card was hitting around 50-60 FPS at 1600x1400 as well as 1024x768 with 4 sample AA, but that's only an estimate.
    Folks, this is an estimate. There are no hard facts in here at all - the framerate could have been half this for all we know. The author didn't even see the actual scores. And where are the benchmarks showing it beating a GeForce2?

    Jeez, give us at least a few facts to argue about next time.

    Cheers,
    Justin

  • The main problem is the speed at which manufacturers are expected to act. They constantly throw current development out the window to embrace the "new" tech, which, in turn, just starts getting hot, only to be thrown out again when something "new" comes along.

    But you're blaming NVidia?! NVidia is still releasing drivers for the TNT! How's that for legacy support? If there are any functions from the TNT chipset that you miss on your Annihilator Pro, I'd like to hear them.

    As for completely using a design before trading it in for a new one, NVidia has released the GeForce256 chipset three times now: as GeForce256, GeForce2 GTS and GeForce2 GTS Ultra, which are basically the same card with the exact same features. The only difference is the speed.

    I guess you must be talking about Bizarro-NVidia.

    A penny for your thoughts.

  • Ok, how many people have monitors that can do 1600x1400?

    Well, I do. My Iiyama VisionMaster Pro 450 can do 1920x1440 according to the spec sheet, and I suspect a bit of modeline tweaking may persuade it to go a bit higher than that. It's not even that expensive. I'd recommend it to anyone looking for a decent monitor.

  • If movies were shown at >24fps (25 here in UK `cinemas`) then maybe I wouldn't get headaches from watching long films.
  • You're not seeing a great difference with today's games because most of them (including Quake 3) don't use the GPU's acceleration of transform and lighting. Right now, I think MDK2 is the only game on the market that uses it, and many more are to follow suit real soon. The GeForce 1 is still ahead of its time with today's games.
  • After all, nVidia has scheduled to release their next generation card in September. This is the card that supposedly uses the same chipset as the X-box (since nVidia is producing the chips for MS), which supposedly makes the PS2 look a bit like an 8-bit Nintendo.

    Sure, the card would've been really cool 2 years ago when they said they were going to release it, but so would the Bitboys card. Personally I think it's funny that 3dfx is going to be releasing a card that will barely beat the last rev from nVidia.

    --
    "A mind is a horrible thing to waste. But a mime...
    It feels wonderful wasting those fsckers."
  • Heaven forbid that somebody build a scalable, basically object-oriented hardware solution

    Yer, so? We're talking MEAT SPACE here. OO works for software cos disk, memory and CPU overhead are pretty much nil on today's hardware, but you can't upgrade meatspace - only so much fits in a case. And being PCs, scalability means jack. We want small AND fast...
  • Hmm... you work for nVidia huh?
  • There are good reasons multi-chip Voodoo5s will never be competitive: yes, it scales, but it just doesn't scale well.

    From the Voodoo5 FAQ on www.3dfx.com:

    4. Is the memory on the Voodoo5 boards unified or segmented? For example, on the Voodoo5 5500 AGP with two VSA-100 chips with 32MB of memory per chip, is the video memory 64MB or is it really just 32MB?

    The video memory is unified, only texture data has to be repeated for each VSA-100 chip.

    Gah? So if you get a 128MB V6 and assume 4MB for framebuffer and z-buffer, it can hold 31MB of textures? 93MB just disappears? [makes indignant noises] I guess this is the only way they could get into the same ballpark as GeForce, but I bet the engineers had to hold their noses.
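    The numbers do check out, as a quick back-of-the-envelope sketch (assuming the FAQ's texture-replication rule and the 4MB frame/z figure above):

        # Back-of-the-envelope check (assumptions: 4MB total for frame and z
        # buffers, every texture stored once per VSA-100 chip, 4 chips).
        total_mb = 128
        chips = 4
        framebuffer_mb = 4

        texture_ram = total_mb - framebuffer_mb    # 124MB of physical texture RAM
        unique_textures = texture_ram / chips      # each texture held 4 times over
        lost_to_replication = texture_ram - unique_textures

        print(unique_textures)       # 31.0 -> MB of distinct texture data
        print(lost_to_replication)   # 93.0 -> MB eaten by the duplicate copies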

  • I also had a TRS-80 Model 1 for my first computer. It had 4K of RAM and a cassette player. Then we got a 16K RAM upgrade and a 90K floppy drive. Man, was I stoked.

    I remember reading a magazine called "Interface Age" back in the 70's. (shortly after I learned how to read, but anyway...) They had an article about how someday we would have video cards with a whole megabyte of RAM, and we could do high resolution color graphics. Even then, they knew that a lot of video RAM would be needed, they just didn't have the technology to do it (cheaply).

    I remember a few years ago when 1 meg video cards were the new rage, and you could do 256 colors at 1024x768. I wondered why anyone would ever need more than that. Then shortly after that, ATI came out with the 3D Expression card. It had two megs of RAM, and 3D instructions on the card itself. WOW. I bought one in PCI format, even though I didn't have a PCI slot to put it in. Got a PCI motherboard a couple of months later.

    Anyhow, my point is - as soon as you think that we have reached the practical limit to how good video cards need to be, they'll think of some new feature that needs a better card.

    Just as a side note - the Atari 2600 had 128 bytes of RAM. Yes, that's bytes. And there were some pretty cool games made for it. (The games themselves were on ROM chips, but they still had a 128 byte limitation for storing character positions, scores, screen states, etc.)
  • The only time 3dfx was ahead in the 3D industry was way back in the Banshee and Voodoo2 years. But ever since NVidia entered the game, they've been ahead: Voodoo3 versus Riva TNT, Voodoo3 versus Riva TNT2 Ultra, Voodoo3 versus GeForce, Voodoo5 versus GeForce 2. Notice a pattern here?
  • He was probably looking at 15fps with the FSAA engine blending the frames together. 3dfx always takes the worst shortcuts like that.
  • Definitely, your FPS is going to be affected by the density of action around you. There is more to it than just action.

    It is partially shared by the CPU and the video card. Anything that involves calculation of object positions, their trajectories, collision detection, etc. is going to eat into your CPU time. So if you are in the thick of a battle, with 12 grenades on the floor, 3 rockets in the air, 3 players, and 80 nails, all of that is going to be eating CPU. This is going to slow down your FPS. Conversely, if you are in a scene that is intense with texture and effects, it'll slow you down.

    So you are right, you want to be pushing more than 60 fps in a still scene so that when you hit action scenes you are still at perceived human maximum. I meant to imply that the fps should not drop below 60 during intense processing scenes, meaning obviously that the fps will be a lot higher most of the time.

  • They have lost a lot of money lately, and their VD5 card isn't exactly selling like hot cakes, and I think adding an expensive product to their lineup isn't going to help.

    The thing I used to like about 3dfx is they could put pressure on nvidia to make better products.

    The VD5500 Linux drivers still only support one of the vidcard's processors; I hate to think of spending that much money on something that is only going to run at 1/4 capacity in Linux.

    On the other side, last night nvidia released their XF86 4.0.1 v0.9-5 drivers. Check www.linuxgames.com. I'm using them now and they are working fantastic. Mostly stability and compatibility fixes. But open source aside, nvidia seems to be more dedicated to supporting Linux than 3dfx right now.

  • by victorchall ( 169769 ) on Wednesday September 06, 2000 @02:52AM (#802134)
    A few of these anti- or pro-6000 comments seem a bit off base, or are just irrelevant. First: so what if it takes 4 chips? If a Dodge Viper with an 8 liter V10 smacks around the turbo V6 in the import (all else the same), who really cares? This alone is kind of a moot point. There is no law or rule saying you have to use a single chip or only 3.2 liters. As long as it meets emissions and gets the same gas mileage as its competitors, who really gives a rat's ass how they get there? Now, there are REAL penalties and issues that are caused indirectly by the brute force approach. Obviously price and gas mileage (or power consumption). The price is ridiculous. This we know, but it is not completely out of line compared to the $500 GF2 Ultra that it will compete with. The 4-chip/8-liter solution is not a good solution for the value/Ford Focus segment. The power issue is really a non-issue, again because they will include an external 50W supply. If you've got $600 to blow on a video card, a $20 surge protector isn't going to break the bank. Then there is size. This is truly going to be an issue. Some people are already having problems with the 5500. I have a feeling the 6000 will be a real chore to install for those who don't have huge full tower cases.
  • Well, TV is interlaced anyhow, so alternate scan lines are drawn at 60Hz; whereas, chances are, your PC monitor isn't (you can get higher resolutions using this, but it looks awful). TVs are blurrier at the pixel level, so this isn't noticeable.

    Also, monitors tend to be optimised for viewing text, not moving images, so they are sharper; hence jerkiness becomes more noticeable.

    And finally (this has probably already been pointed out...), as most TV material is generated from a camera (or CGI simulating one) that has a finite exposure time, there will be some motion blur, so movement appears smoother.

    All this basically means that in order for moving stuff to look good on a monitor, you need a higher framerate. -Steve

  • Too bad they're not as dedicated in supporting their Linux users. If you have any problems, don't expect any help from the e-mail address they supply. And the IRC channel they mention is completely useless. At least with 3dfx, you know who the developers are and you have easy access to them. Ranessin
  • Hey, this rocks - good open DRI drivers!

    I have been debating about getting an nvidia card, but they don't seem to support DRI all that well.

    Mesa support for 3dfx has always been good - go to a BOF and find out!

    If 3dfx sort out their DRI drivers they could be the best there are! And then you could truly kick WinNT at OpenGL.

    The benchmarks for the last lot of Voodoos, Linux against Windows, were nearly matched.

    regards

    john


    (a dyslexic, so please don't moan about the spelling but the content)
  • Sorry you had a bad experience on the IRC channel. I've been hanging around there a lot lately, and find it quite informative. Look for ripperda there; he's the nvidia developer working on the drivers. Nice guy, quite smart and helpful. Also a lot of guys from Loki, Wine, xfree86.org etc. drop by. Kind of a fun place to hang out and learn more about Linux in general.
  • ...and the fact that movies utilize motion blur, which won't work for games. This is because in a game you need to know the exact spot of an object at a specific time to be able to react efficiently. In a movie, you don't.

    This has been debated to death already, I think :)
  • Yup, it goes into the 3.5" drive bay, forcing you to leave that empty, and effectively cutting the airflow space inside the case in two. This card would be a convectional nightmare, requiring you to leave the case open all the time.

    I like NVidia's approach, one GPU, instead of "as many VSA-100s as we can fit on a PCB!" I think that Alex Leupp desperately needs to rip those blinders off of his head.

  • Here's [slashdot.org] someone with relevant experience. Do you think a similar arrangement would apply to the V5-6000?

    Sig gnomes are similar to underpants gnomes; however, the sig gnomes steal signatures.

  • You're just jealous because you're too cheap to afford a 3D Prophet2 GeForce2 GTS 64mb card.
  • Hell yeah, probably would need 4 peltiers on 4 peltiers to cool the 6000. Of course, overclocking the card is out of the question, due to the sheer heat.

    I see that you imitated my former sig; well, imitation is the sincerest form of flattery. I had to think of a new one because all the Linux gnomes were getting offended.

  • It's probably that they can't make the V5 sell well enough. The chips are only 166MHz; it should be a cinch for any decent manufacturer to pull them off.
  • Has anybody seen these benchmarks that show the V5 beating the GeForce2? I have a suspicion that those benchmarks are against the regular GF2 and not the GF2 Ultra. Given the significant performance boost, and the fact that 4 VSA100 chips will not be twice as fast as two VSA100 chips, it seems that the GeForce2 Ultra may just beat the V56K, or at least come damn close. If that's the case, 3DFx has no hope. The V5 is slated to sell at around $600, and with 4 chips and 128MB of (redundant) RAM, I don't see how they can pull the price down. To tell the truth, I'm very pissed off at 3DFx. What used to be the leader in 3D performance and quality has now degraded to making second or third rate products and trying to use market hype to sell them. Not only the sticker fiasco, but their purchase of STB and their artificially keeping Voodoo prices high. There used to be a time when performance freaks wouldn't be caught dead without a Voodoo, and all Voodoo2 cards automatically got a "kick ass" award in Boot magazine. Now, a performance freak wouldn't be caught dead WITH a V5 card, and MaximumPC doesn't even bother to review their cards.
  • ...is the input rates. 3D games that make use of motion (just about all do, as far as I am aware) rely on framerate as a basis for input time slices - i.e. faster framerate = faster input response cycle.

    Play a racing game at 30 fps and you will be all over the road, because you only have (for example - I've not coded the stuff yet so I don't know the actual stats) 30 times a second that the computer accepts input.

    Consider: you're heading for the rail on the left - you yank the wheel to the right; assume input ranges from 0 to 100 with 0 = hard left, 50 = center, 100 = hard right.

    At 30fps you have input at 50, then in LESS than 1/30th of a second you are at 100. NEXT FRAME the system registers a hard right, possibly reeling you out of control.

    At 60fps, the VISUALS do not appear any different, but input is 50, 75, 100, and the game registers the 75, so you maintain control (numbers sketched below).

    This is WAY simplified, but hopefully you can extrapolate the point.

    I would appreciate if you'd excuse any amateur-ness to the post as it is my first on /.
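    A minimal sketch of that input quantization (hypothetical numbers, assuming input is polled once per rendered frame, as described above):

        # Input polled once per rendered frame (assumption: the wheel is
        # yanked linearly from center to hard right in 1/30 of a second).
        def sampled_inputs(position_at, duration_s, fps):
            frames = int(duration_s * fps)
            return [position_at(i / fps) for i in range(frames + 1)]

        wheel = lambda t: min(100.0, 50.0 + 1500.0 * t)  # 0=hard left .. 100=hard right

        print(sampled_inputs(wheel, 1 / 30, 30))  # [50.0, 100.0] -> the game sees a jump
        print(sampled_inputs(wheel, 1 / 30, 60))  # [50.0, 75.0, 100.0] -> the 75 registers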
  • You missed the point........

    1600x1200 is a common resolution, but how many monitors will actually do 1600x*1400*?

    And what games support such an odd resolution?
  • Man, you seem to be poor.

    My current computer has 160,000 times the memory of my first computer (ZX81, 1Ko).

  • Let me say that I did have a nice time hanging out there. All the folks were extremely nice, and they all tried to be quite helpful. Unfortunately, though, it did no good. Ranessin
  • If you want to use your car/finesse metaphor, the relatively new tile-based texturing methods in cards like the Radeon may be representative. Tile-based texturing, when done properly, allows card manufacturers to get by without truly ridiculous amounts of memory bandwidth. It will probably give us headway until some new memory tech comes along, because even DDR has its limits.

    matt
  • You get a nice $600 space heater to heat your room on those cold winter nights. :)
    -----------------
    "Maybe, just once, someone will call me 'sir' without adding, 'you're making a scene.'" - Homer J. Simpson
  • by Tet ( 2721 ) <slashdot@nOsPam.astradyne.co.uk> on Wednesday September 06, 2000 @03:32AM (#802152) Homepage Journal
    You missed the point........ 1600x1200 is a common resolution, but how many monitors will actually do 1600x*1400*?

    No, *you* missed the point. A monitor isn't limited to the resolutions it claims on the box. It's an analog device, and can be run at whatever resolution you want so long as it's within spec. I'd like to run my monitor here at work at 1280x1024, but the refresh rate is too low. The next standard resolution down (1152x864) gives me the refresh rate I want, but it's too small. As a result, I devised my own resolution, and hence I'm running at 1232x944, which is nearly as good as 1280x1024, but it gives me a good enough refresh rate that it doesn't hurt my eyes. There's no reason whatsoever that I shouldn't run my Iiyama at 1600x1400. In fact, having done some quick modeline calculations, it looks like I can do it at around 85Hz, so I may well do that tonight, to give me that little bit extra screen real estate. See the XFree86 Video Timings HOWTO [linuxdoc.org] for more details.
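    For a rough feel of what that mode asks of the monitor, a sketch of the dot-clock arithmetic (assuming typical ~25% horizontal and ~5% vertical blanking overhead; a real modeline needs the exact sync and porch numbers from the HOWTO):

        # Rough pixel-clock estimate for a custom mode (assumed blanking
        # overheads; not a substitute for a properly computed modeline).
        def dot_clock_mhz(width, height, refresh_hz, h_blank=1.25, v_blank=1.05):
            return width * h_blank * height * v_blank * refresh_hz / 1e6

        print(round(dot_clock_mhz(1600, 1400, 85)))  # ~250 MHz for 1600x1400@85Hz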

  • "I think soon we'll find graphics subsystems coming in a seperate box and at this rate it'll soon be bigger than your pc and require its own 3 phase power feed from a deadicated nukelea-r generator (homer's running mine :)"
    - grahamsz

    "I believe that within 100 years, computers will be twice as powerful, 10,000 times larger, and so expensive that only the five richest kings of Europe will own them."
    - Professor Frink

    coincidence?
  • Just checked the pics at Sharky Extreme and a couple of things hit me:
    1. The PCB appears to go into one of the drive bays.
    2. The board has a header below the last chip/fan combo; if you have anything protruding from the mainboard directly in line with the AGP slot, this board ain't gonna fit.

    If it takes four of these chips to beat a GF2 Ultra then it must be a pretty crappy chip when you think about it. ATI's Radeon was faster than a GF2 until the release of the Detonator 3 driver, so 3dfx is very much in third place as it stands really.
  • And the chances of there being any drivers for a Voodoo5 6000 under XFree86 at launch are? Somewhere between 'none-what-so-ever' and 'not-a-flippin'-hope', I'd wager.
  • Wow.. does anyone even need to say anything funny, other than the specifications for this thing, to get a grim chuckle?

    It's like, who in the hell wants to pay MORE money for something that takes LOTS more power and lots MORE space for some bastard video card.. ugh

    Jeremy
  • >Yeah, designing a chip so that you can just keep adding more and more and more to increase performance is pretty crappy

    I didn't say I was against the principle of having multiple chips on a board (I can't wait to see if ATI do a MAXX version of their RADEON cards). If these chips were even close to Nvidia and ATI in performance then they would really have something special with these 2 and 4 chip versions; the problem is that these chips have been delayed for so long that they are in danger of being obsolete before they even get to market.

    Perhaps I should have said the performance of the VSA-100 chip is poor rather than "crappy".
  • These days... most can. Comfortably? Well, you probably won't be getting any use out of 90fps. But you can do it. Today it's almost totally dependent on the amount of memory on the video card; most newer monitors will handle pretty much anything.
  • If visual quality is what you're looking for, the V6000 must blow you away with its 4x anti-aliasing. My Dreamcast only kicks out a fraction of the polygons a PS2 is going to, but the DC antialiases, and that makes it look much, much better. :-)
  • I think I'll call the local electrical coop and see how long it will take to have another transformer mounted near my house.
  • Which is why I think they should come out with another standard. Like maybe a dual 300W power supply, with different adapters for certain situations. I know some people who Beowulf 2 computers together, and even run about a billion hard drives in the same computer. It could be useful for a bunch of different applications; the only prob is it wouldn't be cheaper to implement than the one they supply you with. Although it'd be a long-term solution.
  • As a matter of abstract, ignorant speculation: how hard would it be to implement several modern AGP slots on a motherboard and plug in multiple cards either SLI fashion or SGI "extreme gfx/max impact" fashion?

    AGP - Accelerated Graphics Port
    It's not a multipoint bus like PCI; it's a port, like a serial or parallel port. The chipset would have to be designed to implement multiple ports to have more than one AGP slot. This would add quite a bit of complexity to the chipset. Added complexity == lower yields == higher per-unit prices. Not to mention that it would add complexity to the BIOS code.

    And that would be just for using it with memory-rich graphics cards. If you put in memory-poor cards that use system memory for texture storage (like the i740 did), then you are looking at an order of magnitude more complexity in the video drivers, especially if you wanted to share the memory space between the 2 cards.

    In the short term, forget it. You won't see multiple AGP ports on mainstream motherboards soon.

  • There is a Quantum3D logo on the bottom left corner of the board photographed in the article. Quantum3D have been claiming to have boards based on lots of VSA-100 parts for months, so this doesn't say a lot about the general availability of the 6000 boards from high-volume OEMs. FYI, Quantum3D offer board sets with up to 16 VSA-100 chips in a single graphics system.
  • The reviews of the GeForce2 Ultra from Sharky Extreme benchmark Q3 at 1600x1200 at around 90 fps.

    Check out http://www.sharkyextreme.com/hardware/guides/nvidia_geforce2_ultra/7.shtml


    Dozer

    "The dumber people think you are, the more surprised they're going to be when you kill them."
  • The Voodoo5 5500 has a 4-pin molex connector for plugging into your PC's power supply; AGP can't supply enough power for 2 VSA-100 chips + 64 meg o' RAM. The V5-6000 needs 60W of power and 3DFX can't guarantee your 'puter's power supply, so they bundle one which will work with their cards - no 'My V5-6000 keeps crashing my PC' stories when people with 230W power supplies try 'em. Does make them a bit pricey though, and talk about an inelegant solution; both NVidia and ATI (and possibly PowerVR-3, Bitboys... OK, maybe not them...) have much better solutions.
  • Actually a good point; motion pictures and television (US) run at pretty close to the same frame rates, and nothing is perceptible beyond that. More importantly: with that extra horsepower I should be able to render more quality rather than going beyond 30 raster scans per second. Benchmarks are possibly attempting, by some skewed view, to demonstrate this. (more fps == better images at 30 fps)

    I think we're hitting the end of the curve for this sort of thing, unless we zap out and buy the latest Philips TFT wall display at 1.6 gazillion pixels X .8 gazillion pixels. Sure, that'd be cool, but then you'll need the memory, bus, etc. to deliver that much stuff to the card - or - [Please wait for graphics to be loaded into 3dZilla - 11 Sec. remaining]

    Time to make the games better, IMHO.

    Vote [dragonswest.com] Naked 2000
  • So that's why I'm always compelled to buy Fritos, Coke and McDonald's food after playing Star Wars pod racer...

    Clearly this could be used for no good!

    <flash>Buy Big Bob's Colostomy Bags</flash>

    Vote [dragonswest.com] Naked 2000
  • Dude. "scalable?" The card extended into the tape bay! That is extreme overkill. What now are hardware manufacturers going to start selling their own cases with the card? I can see it now:

    "Try our Brand new, slightly larger, super duper state of the art ethernet cards can do 100gigs a second!"


    Disclaimer:Sun Microsystem Ultra Sparc computing system required to hook up card to your pc.... sold seperately.

  • by -brazil- ( 111867 ) on Wednesday September 06, 2000 @01:49AM (#802187) Homepage
    The motivation is that the PC-internal power lines are too weak to support the card. And I suspect that it is indeed a separately plugged cable. As for video cards getting their own case, they exist and are called "SGI workstations"...
  • Well, you could also ask the point of high resolutions. If you're playing on your (high res) computer screen, 50-60 fps is pretty good. From 80 and up isn't too useful as it usually exceeds the refresh rate anyway.
  • by Contact ( 109819 ) on Wednesday September 06, 2000 @01:55AM (#802193)
    "The author said this "the card was hitting around 50-60 FPS at 1600x1400", which seems most impressive."

    In what, Quake 3? Windows desktop? Incoming? Talk about a pointless comment. I've seen benchmarks for Q2 (admittedly at 1024x768) clocking several hundred fps(!) on an old Voodoo 2 SLI rig by completely downgrading the graphics settings.

    In short, fps ratings mean nothing unless you know (a) the game in question, and (b) the config being used.

  • No one's stopping you from buying a Socket 7 board and running a K6-2+ 500 at 100MHz with whatever reduced voltage is possible and still works. Then a fan should not be necessary at all, and with the right choice of peripherals, you might get away with turning off the fan in the power unit, too. But the box won't be very fast.

    High performance, low price and low power consumption are conflicting goals; you have to make compromises.

  • FSAA done in hardware vs. software really isn't any different. In the hardware case, the hardware renders the image multiple times, while in the software case, the software instructs the hardware to render the image multiple times (or at a higher resolution, or whatever). There really isn't a difference in performance between the two, because the software case only adds a couple of commands per frame. The real difference is between downsampling from a higher res and rendering multiple times. The benchmarks from Sharky Extreme seem to indicate that the GeForce2 performs better (the GF2 FSAA 4x is almost as fast as the V5's 2x) while the V5 looks better. I for one, however, can't really tell much of a difference between the two in terms of quality; I think that the extra features and performance of the GF2 are more important than the V5's slightly smoother image. In most cases FSAA is pretty useless anyway, because images rendered at the higher res tend to look better than an FSAA image rendered at low res (or downsampled). However, I hear that the V5's FSAA is really good for flight sims.
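    A minimal sketch of the downsampling flavor described above (assuming a plain 2x2 box filter over a grayscale frame; real hardware uses jittered sample grids):

        # Supersampling-style FSAA: render big, average down.
        import numpy as np

        def downsample(img, factor=2):
            """Average factor x factor blocks of a 2D image."""
            h, w = img.shape[0] // factor, img.shape[1] // factor
            return img[:h * factor, :w * factor].reshape(h, factor, w, factor).mean(axis=(1, 3))

        hi_res = np.random.rand(1536, 2048)  # stand-in for a 2048x1536 render
        lo_res = downsample(hi_res)          # each 1024x768 pixel blends 4 samples
        print(lo_res.shape)                  # (768, 1024)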
  • "...when the power supply gets kicked out of the socket?"

    3dfx mentioned something about this in an interview of theirs (Sorry, no link handy). Apparently, the card will stop working, but there will be no damage to the card or computer.

  • NVidia is still releasing drivers for the TNT

    But alas! They're still selling the original TNT chipset as well. That's not legacy support. Legacy support would be getting NVidia to release Windows 9x drivers for my NVidia-cloned Hercules 2000.
  • Now imagine twice the Gpus and twice the heat...

    It'd probably make sense to start worrying about ventilation on these cards, not just dissipation. Special ductwork to expel the heated air through the external edge of the card should be workable. There's not much area there though; maybe use an extra card 'slot' for an exhaust port?

    At any rate, this is a real problem for this product. While the extra heat might be acceptable in a normal system, this product is targeted straight at the diehards who WILL have an overclocked machine. They're not going to be happy with a card that buys them more fillrate but costs them 20% in potential CPU clock rate.

  • No, it was at ECTS -- really. It was hardly a boxed product though, so it's open to question whether what was being shown there is in any way representative of something soon to be on a shelf near you.

    You may remember PowerVR series 2, first sighted back in the early Voodoo2 era. It was demoed impressively on quite a number of occasions, with environment mapping and (for the time) high frame rates. But it was an awfully long time until the Neon 250 was actually released (well over a year). The distraction of PowerVR going into the Dreamcast was a big influence there, but it's an example of how much can go on between a board being demoed and a product being sold.
  • by AbbyNormal ( 216235 ) on Wednesday September 06, 2000 @02:02AM (#802211) Homepage
    Does anybody see this as being a bit silly/messy? I mean, I already have about 20 things plugged into a wall socket; why do I need to add another thing?

    The size of the card would also be my issue. That sucker is HUGE!!! That sucker [sharkyextreme.com] goes from one side of the case to the other! No mo' room! I wouldn't be surprised if it came with drip trays and was endorsed by George Foreman. Just a LITTLE too big IMHO, but I'm sure some hardcore gamers will readily saw their cases in half and install an air conditioner for it.

  • by Sneakums ( 2534 ) on Wednesday September 06, 2000 @02:02AM (#802212)

    The headline should read "3dfx's Voodoo5 6000 Still Alive".

    Assuming, of course, that it is.

    --
    "Where, where is the town? Now, it's nothing but flowers!"

  • by substrate ( 2628 ) on Wednesday September 06, 2000 @02:03AM (#802213)
    This product seems to me like something that needs the final nail in its coffin; if not, I wouldn't be at all surprised to see it as the final nail in 3DFX's coffin. At a suggested retail price of 600 bucks I can't see anybody seriously pursuing it.

    It's too high priced for all but the gamers with the deepest pockets (it costs more than some PCs!) and 3DFX doesn't really have any penetration into professional graphics (too many years w/o 32-bit 3D, and supporting proprietary graphics libraries instead of OpenGL).

    It's going to cost more money to produce than they'll make off of it (of course 3DFX loses money hand over fist overall, but they're being punished for it on the stock market).

  • A year and a half ago, the best video cards on the market were distinguished by the streamlining and design of the cards, as opposed to, basically, who could stuff more things onto a board.

    I own a GeForce Annihilator Pro, and I am quite happy with it. I was just as happy with my old card, a Viper 550. It worked great, until Quake III came along.

    Unfortunately, the GPU on my Annihilator Pro doesn't seem to do half the job it claims it does. It's like I just went from a V550 to a V770. Sure, I get some more frames, but in the end, not *that* many more to justify the $300 starting-gate tag.

    Now, it seems every card is trying to move up in the world of GPUs. (Remember when it used to be RAM?) One chip, two chips, red chips, blue chips.

    It reminds me of the 60's muscle car era. Just stuff a bigger engine into it, and it'll go "faster". That's fine when you're driving in a straight line. After that played out, you started to see cars with finesse. Cleanly designed engines that went with their body styles. Smaller, faster, (ok, not cheaper).

    The main problem is the speed at which manufacturers are expected to act. They constantly throw current development out the window to embrace the "new" tech, which, in turn, just starts getting hot, only to be thrown out again when something "new" comes along.

    This is unsettling. While video cards continue to get faster and faster, they seem to shrink away from what would be called a "legacy" design. Unfortunately for them, they seem to have lost the ability to completely use a design before trading it in for a new one.

    krystal_blade

  • by StarFace ( 13336 ) on Wednesday September 06, 2000 @02:06AM (#802215) Homepage
    Yes, and when a spaceship or some other fast-moving object flies past the camera on television or film, there is a noticeable drop in motion quality. Watch any Star Wars movie and you'll see what I'm talking about; everything gets jerky when the action is close to the camera.

    I believe I saw somewhere that the human eye cannot perceive anything above around 60-70 fps. So, the world around us would be felt at around 65 fps, give or take a number of conditions. (Somebody correct me if I'm wrong about this. Even if I am, the next paragraph still applies.)

    High-speed, addicted gamers need 'real world' conditions to play in. If they are in the middle of a very fast battle with action flying all around, they can't afford to cut their world down to half the frame rate. You'd be missing half of the detail. Good first person shooters use insane sensitivity on their controls, meaning you max out that fps very quickly. If you've got a high fps you can execute a quick 180 degree turn and see everything WHILE you are turning, possibly tossing off a rocket mid-turn. Lower your fps and you might have missed your opponent; lower your resolution and you have the same problem because of 'pixel mud'.

    Seems insane, but you gotta understand the addicted gamer before it makes sense.

  • This card is like a gas guzzling SUV that Slashdotters make fun of soccer moms for driving around town. Any card this piggy with the power doesn't belong on the market.

    Ah, moderated down. There are some things that geeks just do not want to have pointed out, apparently :)

    Overall balance between power consumption and computing power is getting to be more and more important. Heck, I have a 400MHz machine that I use for heavy duty compiling and I have absolutely no complaints about speed. If I really, really wanted my machine to seem faster, I'd switch to a faster compiler. Using Object Pascal is tempting, because it compiles on the order of 100x faster than Visual C++. Or I could switch to using an interactive environment (e.g. Lisp, Smalltalk), so I don't have to worry about compilation time at all. Or I could get a compiler that's three times as slow and upgrade to a machine that's twice as fast. Hmmm... what am I missing here?

    At the same time, there are people willing to have The Ultimate Graphics Card, even if they need to hook it up to a car battery and use a '75 Pinto for a heat sink. Is that wasteful in the same way that people drive four wheel drive vehicles in Dallas or Chicago because of the free-spirit image? Yes, of course it is. Let's not fool ourselves.
  • My computer, with its 5 external SCSI drives, makes enough noise and consumes enough power as it is. Now I gotta add another device that basically does the same thing? Man...

    Video cards are becoming so sophisticated that eventually, if not now, it's like running a second computer within your main computer.
    Seriously though, if this card needs more power than it can get from an AGP slot, then maybe they should just hook it up to the internal power supply. They might have to create a new standard in power supplies, but I am sure this isn't the only card that'll head this route. [Unless someone comes out with the notion that smaller is better, type of marketing, with another card.] Wouldn't surprise me though. Every time I upgrade I have to buy a new case, cause the old one is obsolete.
    I've got a 200W, 250W, 280W and a 300W, and I have yet to get an ATX case. Still running the old P5A-B, with the AT 300W case. Still good. Refuse to upgrade until this Intel vs. AMD pissing match is over (or at least subdued).
    I haven't seen any specs, but it would seem to me that the card will probably not need more than 12V, which is what it could get from the power supply without any extra wires or soldering. And the power supply can handle it. I came across some old Aptiva speakers that use the computer's internal power (instead of a plug, it has a 12V adapter port attached). To power them I just ran wires from the 12V leads inside the case to a jack I rigged on the back of the case. Works fine. And I think the same could be done for the Voodoo5 6000.

    Come to think of it, I wouldn't be surprised if hardware sites came out with adapters for the
    Voodoo5 6000, after it comes out, to power the thing internally.
    Just think if you were to accidentally unplug your video card while your computer was running. I don't think it'd recover from that.
  • when the power supply gets kicked out of the socket? 1 proc only? total failure? partial failure? burned card?
