GeForce4 Ti 4200 Preview
Mike Chambers writes "Hi All, I've completed a preview of NVIDIA's GeForce4 Ti 4200 graphics chipset. Although the preview contains your typical benchmarks, it's centered on gameplay and antialiasing image quality. Here's a list of the games involved - Quake 3 & Team Arena, IL-2 Sturmovik, Nascar Racing 2002 Demo, Jedi Knight 2, Serious Sam 2, Max Payne Demo, Comanche 4 Demo, Dungeon Siege and Tiger Woods PGA Tour 2002 Demo. Since antialiasing image quality, especially Quincunx and 4XS, was an important aspect of the preview, all of the screen shots were saved in high-quality PNG format. For those Slashdot readers who are avid gamers, you might want to check this out."
Good stuff (Score:3, Informative)
Re:Good stuff (Score:1)
But the main question is: with software unable to keep up with the pace the hardware industry is setting, is the nVIDIA GeForce4 worth it?
Hello, SPOILER warning? (Score:5, Funny)
More GeForce4 reviews (Score:1, Informative)
If this gets /.ed... (Score:1)
I'm happy with my old 32mb card (Score:3, Interesting)
That being said, I'm not against the new developments. It certainly does look like an awesome card; it just seems to me that this particular market segment could almost be bled dry, and these cards may have to find something else to be useful for in order to survive. I don't have a deep enough understanding of the market or those in it to make a serious call on it, though.
I remember reading a long, long time ago about developments aimed at moving cycles across to other processors (i.e., big nasty graphics cards) so they could offset workloads when they weren't being fully utilised (99% of the time you aren't playing games). Anybody know what happened?
Re:I'm happy with my old 32mb card (Score:4, Interesting)
Well, there's no reason to get a GeForce 4 now unless you're a software developer or really need those extra 5 FPS in Quake3 (305 FPS instead of 300 FPS)... GF4 (and GF3) do offer significant advantages over older cards, but since the development cycle for high-quality games is about 3-4 years, compared to the development cycle of new graphics cards (6 months - 1 year), the game engines are always lagging behind. Nothing out yet even really takes advantage of what the GeForce 3 has to offer -- until Doom3 is released, anyway.
The best thing to do is just ignore these new card releases, and let other fools buy them just to be 'l33t'. In a year or more when you can actually buy games that will use GF4 features, the cards will be much cheaper than they are now...
Re:I'm happy with my old 32mb card (Score:1)
Re:I'm happy with my old 32mb card (Score:2)
Re:I'm happy with my old 32mb card (Score:3, Insightful)
Sure, I'll be glad when games specifically target my card, but for now, I'm enjoying some particularly clean looking software. It's worth the extra money to me, and it has nothing to do with being l33t.
Re:I'm happy with my old 32mb card (Score:2, Informative)
Well, there's no reason to get a GeForce 4 now unless you're a software developer or really need those extra 5 FPS in Quake3 (305 FPS instead of 300 FPS)
The words of a true non-gamer willing to expound their wisdom for all to see. I play the game Urban Terror (urbanterror.net), which is a Quake 3 mod, on a mid-range system with a GeForce 3 Ti200: I have to turn a significant number of features down to run smoothly at 1024x768 32-bit, and even then certain parts of certain maps slow to a relative crawl (crawl being 20fps or so: it feels sluggish and throws your timing off, not to mention that it ruins any immersion). Don't even get me started on AA, because in real applications (i.e. not a stock Quake 3, which virtually no one plays anymore) it is a frame rate super killer.
Re:I'm happy with my old 32mb card (Score:1)
It may even feel a bit like a conspiracy between the game developers and the 3D card makers. You may even think the differences between last year's games and this year's are really subtle. Just compare Quake I, Quake II, Quake III and maybe Soldier of Fortune 2 (released a day or two ago) to get a better appreciation of how far we've come.
CySurflex
(Unhappy with my old 16MB card)
Re:I'm happy with my old 32mb card (Score:1)
Re:I'm happy with my old 32mb card (Score:1)
Re:I'm happy with my old 32mb card (Score:2)
If it's an Abit, Asus, or other motherboard that allows you to seriously overclock, you might want to see if the motherboard will support a Socket 370 to Slot 1 adapter. That will allow you to run the Powerleap PL-370/T CPU upgrade (which should be released very soon); this will bump up the speed of your system from 450 MHz to 1,200 MHz!
At 1,200 MHz CPU speed, even the Matrox G400-TV should be fast enough to run most modern games.
Re:I'm happy with my old 32mb card (Score:3, Interesting)
That said, I'm upgrading to a low-end GeForce4 tomorrow. The Voodoo just doesn't cut it for more heavy-duty stuff, and it has no native MPEG decompression abilities, rendering DVDs a bit choppy at times. Also, it has no TV out - a feature "standardized" in cards only recently, and IMO a necessary feature when used in conjunction with a DVD drive. Hence, an upgrade. There will always be people who have a) the money, and b) the desire to have the "latest and greatest", and they are the ones who fuel these new products, although at $200 it's not a bad price. I'm sure companies take into account the fact that second- and third-generation hardware sells better... but it's not second or third generation if there isn't new hardware to succeed it.
Pfeh...I need sleep.
Re:I'm happy with my old 32mb card (Score:2)
Re:I'm happy with my old 32mb card (Score:1)
This rings a bell. The phenomenon is nothing new (note the date below!) and is known as the Wheel of Reincarnation. Quoting the Jargon File 4.3.1 [tuxedo.org]:
Re:I'm happy with my old 32mb card (Score:1)
However, I hope progress isn't slowed, as the holy grail is to achieve real photo quality in real time, and that would be an awesome sight.
Which do you recommend ? (Score:2)
I want to create 3D animation - the choice of animation software isn't yet set. The ones that I am looking at are Lightwave, Maya, 3DS Max, Blender and POVRay.
Which one do you recommend?
On the hardware side, which graphics card do you recommend?
I am sticking with the x86 platform; the OS can be Windoze, Linux, BSD, or BeOS.
All suggestions will be very much appreciated!
Thanks in advance!!
Re:Which do you recommend ? (Score:1)
I expect GeForce4 Ti4200 to sell in large numbers (Score:3, Informative)
There are two reasons for this:
1) It is less expensive to implement, so OEMs will be far more interested in installing this card instead of the much more expensive cards that use the Ti4400 or Ti4600 chipsets. Besides, the performance drop is not significant, so most users won't see any performance hit even on the latest games. This is why I expect many system builders to incorporate graphics cards that use the GeForce4 Ti4200 chipset into new systems on a large scale by July 2002.
2) Because it is an NV25 chipset, it also means that the card will sport higher-level MPEG-2 decoding support. That means hardware assistance for playing back DVD discs as good as what ATI has done with their Rage 128 and Radeon chipset series.
I think you must like the Matrox G400/G450/G550 cards. Yes, they have excellent 2-D display, but the GeForce4 Ti4200 has vastly surpassed them in 3-D graphics and, with the right manufacturer, achieves almost as good 2-D display quality.
I'm deeply skeptical. (Score:2)
Everything about the damn GeForce.
First, it was having constant conflicts with Something-Or-Other during POST--I'd get a really annoying system beep and no video output, period. Yanked my SoundBlaster AWE32 and presto, it boots. Weird. Why was the GeForce 4 conflicting with my SB?
Now it works reasonably well, except that I'm forced to use my on-board AC97 audio (which sounds like ass, and esd really doesn't like it). Reasonably well, except for the occasional spontaneous reboot... which occurs for reasons I haven't been able to track down yet.
In Win2000 it's the same story--except that when I connect to the 'Net via my external modem (COM1), I'll randomly get a BSOD or a spontaneous reboot.
Why in the billion names of JR "Bob" Dobbs the GeForce 4 causes so many hardware conflicts, I have absolutely no idea.
When it's running, though, it's a pretty sweet board.
By comparison, my last card was a Voodoo3. Nice, simple AGP card; I plugged it in, it worked, never conflicted with anything.
Re:I'm deeply skeptical. (Score:1)
I have two dual machines: one dual PIII 600 on an AOpen mobo, and one dual PIII 733 on an MSI mobo. I have been experiencing lockups and sudden reboots on both machines when using various GeForce cards, everything from a GeForce 256 (SDR) to a GeForce 3 64M. It is very unstable, and it mostly started after nvidia's release of the 6.xx drivers. So I think they have done something that makes SMP systems unstable, to get more juice out of the cards, since the 6.xx driver offered some 20% more fps in certain games.
I have tried to get some info from nvidia or other sources on this subject, but I have been unable to find any. I first thought it was my current hardware setup that was the problem, but now I have tried two different mobos and some 5-6 different GeForce cards (from different manufacturers too).
It's very annoying in the long run, since I am also working with 3D computer graphics, and if I'm working on a project I don't like a sudden lockup.
-P
Re: (Score:1)
I haven't had your problems... (Score:2)
If it is the GeForce4, it's probably your particular card.
Did you try taking it back for a replacement, before telling us all how bad it is?
More likely you have some weird BIOS issues or power problems... you should check those, too.
Re:I haven't had your problems... (Score:2)
I think that could be part of the trouble.
I wonder whether clearing the CMOS NVRAM and getting a decent 300-watt ATX power supply would help things along. Believe me, I've seen clearing the CMOS NVRAM on the motherboard fix a LOT of Windows 9x/ME/2000/XP Plug and Play setup issues.
Re:I haven't had your problems... (Score:2)
Yah... I keep my BIOS as barren as possible, and I use a 400-watt in my tower =) Although from reading other comments, there may be a real issue with some SMP systems that people have...
Re:I haven't had your problems... (Score:2)
Re:Your problem (Score:2)
PC Gaming (Score:1)
From memory, the new Nvidia card was listed at around $350, and it can be noted that the tests were performed on a high-end processor with a healthy serving of RAM. Although this concoction transparently serves as a powerful PC for all your non-gaming needs, does this serve as a warning about building 'game boxes'?
Even against Sony's impressive software library, I would argue that the PC offers the best range of gaming (quaking etc.), but with M$ entering the console market, will this be the case in times to come, and is it possible we are approaching a separation of mainstream PC use and gaming?
Food for thought anyway.
Re:PC Gaming (Score:1)
You are referring to recent articles where M$ seems to be pushing games from the PC to a console so the PC can be the entertainment (har har) center of the house?
Well, fashions come and go every decade.. is it time for the games to move back to the console.. (only to have them come back to the PC in 5 years time?)
Finally, 128MB on a GeForce 3/4 (Score:2, Informative)
Now, finally, a memory upgrade and a visible performance improvement.
Re:Finally, 128MB on a GeForce 3/4 (Score:1)
Re:Finally, 128MB on a GeForce 3/4 (Score:2, Interesting)
Re:Finally, 128MB on a GeForce 3/4 (Score:1)
Hopefully the GF4s will break that trend, since 128MB is the rule there, and not the exception.
Geo
Re:Finally, 128MB on a GeForce 3/4 (Score:1, Funny)
Re:Finally, 128MB on a GeForce 3/4 (Score:2, Informative)
128MB vs 64MB makes maybe a 2-3% difference. They've benchmarked the same card (Ti4200) in 128MB and 64MB models, and the 64MB model was FASTER, because it ships with faster (more expensive) RAM.
The BIGGEST difference among all the GF3/GF4 cards is the memory speed.
Graphics are still limited by fill rate in 90% of games, so if you have 10% faster memory, you get 10% faster framerate (see the quick sketch below).
Problem is, 10% faster memory costs 20% more, and so on, due to yield concerns.
All the 128MB vs 64MB will let you do is:
-run at a higher res anti-aliased... but this doesn't matter if you don't have the speed
-use more textures... but just about all textures in games are now compressed, so they're not even filling up 64MB
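If the fill-rate claim above holds, the scaling is simple enough to sanity-check. A toy model in Python (the numbers and names are mine for illustration, not benchmarks):

    # Toy model of a fill-rate-bound game: framerate scales linearly with
    # memory clock, as claimed above. Illustrative numbers only.
    BASE_MEM_MHZ = 500   # Ti4200 effective memory clock (from the preview)
    BASE_FPS = 80.0      # hypothetical framerate at stock speed

    def estimated_fps(mem_mhz):
        """Fill-rate-bound estimate: fps proportional to memory clock."""
        return BASE_FPS * mem_mhz / BASE_MEM_MHZ

    print(estimated_fps(550))   # 88.0 -- 10% faster memory, 10% more fps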
benchmarks (Score:1)
all these screenshots... (Score:1)
games are evil, and the largest threat to widespread use of opensource software...
FYI: games and hardware (Score:1)
I have made 3 empirical observations of the game industry:
1. Games are about 5 years behind cutting-edge graphics research (academia (SIGGRAPH), mostly).
2. The graphics/engine programmers generally have the best hardware on the graphics team, to allow them to test out the latest hardware advancements.
3. Games nowadays take around 2-3 years to make, so the cutting-edge hardware they target at first will have become midrange by the time the game gets released.
I don't get it... (Score:1, Flamebait)
"The following is a list of the hardware and software used in this preview.
AMD Athlon XP 1800+ @ 1.53GHz
NVIDIA Reference Motherboard (nForce Chipset)
256MB Corsair PC2400 DDR RAM
21-Inch Sony Multiscan E500 Monitor
NVIDIA Reference GeForce4 Ti 4600 (300MHz/650MHz) - 128MB
NVIDIA Reference GeForce4 Ti 4200 (250MHz/500MHz) - 64MB
NVIDIA Detonator XP Driver Version 28.32
32-Bit Color / Sound Disabled * / Vsync Disabled / 75Hz Refresh Rate
Windows XP Professional / DirectX 8.1"
Ok, you're reviewing a card with 128MB of video memory, yet your main system memory is only 256MB? On WinXP? Dude, just shell out the extra $$ for at least 512. Unless using 2 DIMMs somehow cuts your performance. Who's using 256? Compaq?
Re: (Score:1)
Re:DDR (Score:2)
Good stuff... (Score:2, Insightful)
Yummy. I want one.
color me stupid... (Score:2, Funny)
The average user doesn't need his screen blurred; the monitor does that well enough for him/her.
"omg timmy! did you see those jaggies!"
"dude, don't be a magnifying glass hog!"
Re:color me stupid... (Score:1, Informative)
The human eye's resolution is about 1/60 of a degree. So your screen resolution will match your eye's resolution (i.e. you'll see the jaggies) if your distance to the screen is X times the width of your screen (or closer), where
X = 360*60/(2*PI*1600) = 2.15
So I'm afraid you'd have to stand as far as 4 times the width of your screen to blur the jaggies with your eyes at 1600x1200 resolution (assuming you have good vision).
(Sorry, I just feel like calculating stuff tonight.)
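The 2.15 figure above checks out, and the same formula generalizes to other resolutions. A quick Python sketch (the function name is mine):

    import math

    def critical_distance_in_screen_widths(horizontal_pixels, eye_arcmin=1.0):
        """Distance, in screen widths, beyond which one pixel subtends
        less than the eye's ~1 arcminute resolution."""
        eye_rad = math.radians(eye_arcmin / 60.0)
        return 1.0 / (horizontal_pixels * eye_rad)

    for px in (800, 1024, 1280, 1600):
        print(px, round(critical_distance_in_screen_widths(px), 2))
    # 800 4.3 / 1024 3.36 / 1280 2.69 / 1600 2.15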
Re:color me stupid... (Score:1)
1600x1200 4xFSAA is like a 3200x2400 pixel screen, slightly blurred.
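For what it's worth, the "render big, then average down" idea described above is plain ordered-grid supersampling; the GeForce4's Quincunx and 4XS modes use different sample patterns, but the principle is the same. A minimal sketch, with a made-up stand-in renderer:

    import numpy as np

    def render(w, h):
        """Stand-in renderer: any function returning an (h, w) grayscale frame."""
        y, x = np.mgrid[0:h, 0:w]
        return ((x + y) % 16 < 8).astype(float)   # hard-edged stripes, jaggy-prone

    def supersample_4x(w, h):
        """Render at 2w x 2h, then average each 2x2 block back down to w x h."""
        big = render(2 * w, 2 * h)
        return big.reshape(h, 2, w, 2).mean(axis=(1, 3))

    frame = supersample_4x(1600, 1200)   # a 3200x2400 render, slightly blurred down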
Re:color me stupid... (Score:1)
Geo
old news (Score:2)
Is this so special? (Score:2, Informative)
Gerb
Re:Is this so special? (Score:1)
Looks like the real deal! (Score:2)
If you want some more information, here's some good reviews/articles I saw today during my daily browsing:
Compare these numbers against Nvidia's previous attempt at the budget arena, the MX 440 here [sharkyextreme.com]. A much needed improvement!
Super nice!!! (Score:1, Interesting)
Get a grip, dudes. Even if you don't like it, the GF4 IS progress. What I am worried about, though, is the competition. Will ATI have the power to respond with an equally good video card? If it doesn't, and NVIDIA stays the only player in the graphics card market, we are doomed, and open source OSs especially.
here we go again (Score:1)
So is it time to drop the $400? To rely on buggy drivers rushed out by ATi or nVidia? To snarl at DirectX's mysterious problems, which may or may not be related to some of your older hardware not agreeing with your new card?
You've stared at the numbers on the site, and you don't see any reason why not. Did you know some sites exist (and make money) just by getting new video cards and "benchmarking" (aka "playing") them? Is this fair? Are you going to contribute to this universally unfair practice? Of course, you clicked through to buy from the first vendor listed on the site. You can hardly wait for the UPS man to come tomorrow (you can afford expedited shipping, you only paid 95% of what you'd pay at a retail store anyway).
As a savvy PC gamer, you've already downloaded the latest crack off Usenet. You never pay for software-why should you? The hardware costs enough as it is, besides, each game on the PC is just an iteration of Doom or Command and Conquer. Brainless blowing away, or boring resource management? You love 'em both. Or at least, they're available, and you play them.
You laugh at your buddies with an Xbox, because "I can build a more powerful system than that for half the cost!" You've scorned the Gamecube because "The Gamepurse is for kiddies!" Your Playstation 2, purchased for Final Fantasy X, lies collecting dust next to your DVD player (which sucks compared to the one on your computer-NATCH!)
You pause a bit to think about your computer purchases over the last year:
Now this GeForce 4 will be about $400, but it's worth it! Buy a Mac? Never! They don't have games, and besides, they're too expensive.
Buyer's remorse never seizes your temples with its steely vice grip. You'll never lose your job at the helpdesk, and even if you do, Mom and Dad will be there to help you out. You're a sharp guy, and you're surely going places. Right after this game of Return to Castle Wolfenstein, that is...
Re:here we go again (Score:1)
Re:here we go again (Score:2)
Re:here we go again (Score:1)
Do you only play games? (Score:4, Insightful)
As with almost all graphics card reviews, the only tests/benchmarks this review has are games. I don't know about the rest of you, but I actually don't play games the majority of the time I'm using my PC, and therefore this review is sadly almost useless to me.
I would like to see a review that actually had a serious focus on 2D performance and quality.
No matter what, I'll not buy a Geforce4 card - AFAIK they have and need active cooling, and I don't need that - I want a card with passive cooling! A Geforce3 Ti200 should actually be able to run with only a nice large heatsink, and that is what I believe I'll be buying soon. It is much cheaper too, and its 3D performance is still excellent.
Re:Do you only play games? (Score:1)
You'll have to go down to a GF2 MX or TNT2 before you get a card that doesn't need a fan at all. Your 2D performance will be roughly the same as the GF3 (though the 2D quality is noticeably higher on the later nVidia cards), but 3D performance will be significantly worse.
Geo
Re:Do you only play games? (Score:2)
I'm pretty sure I'm correct, since the Nvidia Geforce3 TI200 reference card came with passive cooling only ;)
A lot of cards only come with a fan because it "looks cool" - not because they need it. It's the same thing with motherboards and their chipsets. A lot of motherboard makers put a small heatsink and a fan on the chipset, even though a larger heatsink would be enough. Take the latest boards with the Via KT333 chipset - a lot of the boards have a fan on the chipset; the Asus boards don't.
Re:Do you only play games? (Score:1)
FWIW, I was able to get one hell of an overclock on my card, with a pretty craptastic heatsink and a thin fan. Given that, I guess you probably -will- be able to run at stock speeds with a heatsink and decent airflow.
Still, don't pull off the fan until you've tried it. Unless you're running a fanless system, I'll lay odds you won't hear the GF3 fan. Assuming your design doesn't diverge much from the one I have, it's really quiet.
Geo
Is it just me (Score:5, Insightful)
In the mid 90s, for some reason, something happened. Suddenly the mainstream opinion was that a game without 3D was somehow inferior to the 3D ones, so *everything* had to be 3D. Face it - 3D is just a gimmick like anything else. For most games, 3D is just wrong. It makes the interface bad and worsens gameplay. We humans are by nature not fully 3D-compliant (e.g. see Rubik's Cube for proof). Imagine what a pain in the ass a 3D window manager would be (yeah, I know some people research it, but that is their problem, isn't it?).
IMHO games are now in the childish state of "the more real it looks, the better". Now, I am certainly not opposed to the idea of beautiful games. I want stunning, great looking games. But where would art be today if it had stopped at the rather primitive notion that the painting that most resembles reality is the most beautiful?
I don't know about you, but when Heroes of Might & Magic III came out (New World Computing makes arguably the most beautiful 2D-graphics in the world), I was far more impressed by the beautiful details and the general mood that they managed to generate, than by the graphics of Quake III (or whatever FPS-clone was the current rave then).
Don't get me wrong, there are games that benefit from 3D (Tekken comes to mind), but not *all*. Is there even a non-3D game available for the xbox?
Damn the lemming mentality of the game publishers... Will I ever see stunning artwork again?
Re:Is it just me (Score:3)
Re:Is it just me (Score:2)
My guess is when 'Final Fantasy' (the movie) level graphics appear in games we will see a reaction against photo-realism and see some really creative ideas start popping up (most really terrible, but some great).
Re:Is it just me (Score:3)
Personally, I have recently been stunned by the quality of the graphics in Serious Sam 2E. I thought I wouldn't see anything more beautiful than Unreal/UT before U2 came out. I was wrong.
And what's wrong with playing the old games? I have wasted some major time recently on replaying Crusader and X-COM. Kickass games are worth keeping and replaying.
Keep in mind that good games come out rarely. That doesn't mean they don't ever come out.
I think you're missing the point (Score:2)
This is a review of a 3D accelerated video card. It is designed to render 3D games, so reviewing it with respect to how well it does that job is really the only useful way to discuss it.
I have no comment on your ideas about the merits of 3D gaming. I happen to enjoy 3D games a great deal. I also like chocolate and I don't like cheese. What of it?
A better one (Score:1)
[H]ard|OCP's first Ti4200 preview [hardocp.com]
[H]ard|OCP's second Ti4200 preview [hardocp.com]
What it's missing.. (Score:2, Insightful)
The 64MB card, at the stock memory clock of 500MHz, outperforms the 128MB card at 444MHz in almost every single test, obviously because of the large difference in bandwidth available from memory to core and back. The HardOCP review of the same card shows the 64MB card beating the 128MB one by a few FPS in almost every test. The 128MB card should still be the one sought after, but only because its memory can be overclocked to exceed the 500MHz memory spec of the 64MB card. You can always overclock the 128MB card, but you can't add more memory to the 64MB one.
Wish reviewers did a little better job of explaining why they recommend things.
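The bandwidth gap is easy to put numbers on. A back-of-the-envelope sketch, assuming the Ti4200's 128-bit bus and treating the quoted 500/444MHz figures as effective DDR rates:

    # Rough memory bandwidth for the two Ti4200 variants (128-bit bus).
    BUS_BYTES = 128 // 8    # bytes transferred per effective clock

    for name, mhz in (("64MB @ 500MHz", 500), ("128MB @ 444MHz", 444)):
        gb_per_s = BUS_BYTES * mhz * 1e6 / 1e9
        print(f"{name}: {gb_per_s:.1f} GB/s")
    # 64MB @ 500MHz: 8.0 GB/s
    # 128MB @ 444MHz: 7.1 GB/s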
Hows the TMDS? (Score:2)
I've been looking to find good video cards for high resolution flat panel monitors but want them to be driven digitally instead of with an analog signal (even one sneaking in through the analog connectors in the DVI-I connector).
But really high-resolution displays are made useless by the many graphics cards that only support resolutions up to 1280x1024 or 1600x1200.
I had hoped that the recent nVidia chipsets would have some good TMDS hardware.
Do they?
Re:the need for ... high fps? (Score:1)
This is quite correct. The game is extremely playable with a "lowly" 1.2 GHz Athlon with a Geforce 2 Ultra. I usually play it at either 1024x768 or 1152x864 resolution.
Re:the need for ... high fps? (Score:1)
Re:the need for ... high fps? (Score:1)
20-30 FPS is more than enough for a game like Dungeon Siege where little to no fast-action decision making or aiming is going on...Unfortunately these whiners are the same sort of FPS-(frames per second)whores that think you need 300 FPS in Quake (do their monitors refresh 300 times per second? Don't think so...yes I realize you want a high peak FPS so the low-end FPS is still pretty high, but even given that 250+ average FPS is just overkill for any game).
Re:the need for ... high fps? (Score:2)
Dungeon Siege is one of the most beautiful 3D games I've ever seen, but reading forum posts about it I realized that even a Geforce3 with a killer Athlon XP system would still only let the game run at around 20-30 fps.
Well, I've only played the demo, but it's solid at 60fps on my Athlon XP 1600+ w/ GeForce2Ti. So unless something's horribly broken in the full version of the game, I'd call bullshit on those claims.
This is at 800x600x32bit with all the optional features (shadows etc.) turned on. It looks just fine.
Re:the need for ... high fps? (Score:3, Informative)
Re:the need for ... high fps? (Score:2)
From the readme doc accompanying the demo:
Reduced Visual and Audio Quality: In order to compress Dungeon Siege to a reasonable demonstration size, many textures and sounds have been heavily degraded from the retail version.
Well root my boot. I totally missed that. Yep, true enough. Since just yesterday at work we were dealing with the consequences of texture-page-thrashing due to too-large textures, I can surely see how this would improve my fps..
But heck, the demo looks fabulous even with the heavily degraded features, doesn't it. :)
Looks nice. Unfortunately, after a solid afternoon playing it, I'm pretty much bored, and don't expect to be buying the game. Far, far too repetitive for me..
Re:the need for ... high fps? (Score:2, Interesting)
In game time every element moves at its own pace, but the framerate is how often those elements are updated on the screen. The rendering of this game is no different from that of UT2003 or any 3D FPS. They all use time-based systems, so they don't run slower on old machines, only at a lower framerate.
Every single update, the entire screen is redrawn from scratch. Seems pretty insane, but that's how it's done. The rasterization process takes a ton of triangles and turns them into 2D for your flat screen hundreds of times a second. The reason for this is that every time the 3D coordinates change, they change the shape of the 2D scene you're seeing on your flat monitor.
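For anyone wondering what "time based" means in practice, here is a minimal sketch of a frame-rate-independent update loop (the names are made up for illustration, not from any particular engine):

    import time

    SPEED = 120.0      # units per second, independent of framerate
    position = 0.0

    last = time.perf_counter()
    while position < 100.0:
        now = time.perf_counter()
        dt = now - last            # wall-clock seconds since the last frame
        last = now
        position += SPEED * dt     # same speed on a slow or a fast machine
        # render(position)         # a slower GPU just produces fewer frames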
Re:the need for ... high fps? (Score:2, Insightful)
I, personally, couldn't care less about frame rates as long as the game looks nice. I've seen Quake run at 40 FPS vs ~70 and could not initially pick out which one was which. After being shown which one was running faster, I *think* I could see some difference in high-speed turns, but it just as easily could have been psychosomatic. Then I have seen some games run at high frame rates and suck. Though, of course, like most other people, I *like* that number to be as high as possible; I just don't get mad when it's not and the game still looks nice.
Re:Hey (Score:2, Funny)
Too bad the Linux "d00ds" will not enjoy as great a gaming experience with this bad boy as us "Windows" d00ds. I could never see Everquest, Camelot, or anything like it as open source. Would ruin the game totally.
It's called Nethack, duh - which is why us Linux "d00ds" need to buy GeForce4s to make text mode games... even faster! er...../me mutters incoherently and runs to corner.
Re:Hey (Score:2, Interesting)
Re:Hey (Score:2)
Re:Hey (Score:2)
Re:HEY (Score:1)
Hey! Welcome to the real world where quite a lot of Linux/BSD users have either two boxes or dual boot.
I have, for your information, two boxes: a 633 with Mandrake and FreeBSD on it, which does just about everything; and a 1.8G Intel box that runs windoze, basically runs games, and needs to be reinstalled every 6 months (minimum).
This card I would throw in the Doze box, of course, for games, of course - why bring Linux into this? Anyway, in a while the Linux drivers will be out.
Re:HEY (Score:1)
I'll continue to dual boot until I can live without a windoze box for good. It's very close at the moment, in case you hadn't realised. The only thing windows is really good for is games. See above comments about this.
If .NET takes off
I concede your point about souls :( If you have an alternative, please speak up.
Re:HEY (Score:1)
http://www.nvidia.com/view.asp?PAGE=linux [nvidia.com]
They just added Geforce 4 support and OpenGL 1.3 extensions.
NVIDIA has their own Linux driver team, and the drivers perform on par with the Windows ones. The only thing lacking is Direct3D support, which is useless under Linux anyway.
Re:Oh for the days of good code (Score:2)
Seriously, what else is your machine doing while you're playing Quake3 or similar? It's not like you think "Well, that's the compile started, what can I do while I'm waiting? I know, how about a quick game of UT!"
I remember seeing a post to a newsgroup (c.o.l I think) from someone who'd just installed a bunch of RAM and was wondering why it was all being used. One reply explained what it was being used for (buffers and cache, of course), and said "no point having it if it's not going to be used!" Same thing applies here.
Sure, it's a bit rough on those of us with lower-end PCs (I have a GeForce3 Ti200, but "only" a P3 700), but them's the breaks. Time moves on and machine specs increase. Are you saying that games makers shouldn't target the machines that will be on sale at the time the game comes out? We'd still be playing Pac Man and Space Invaders clones if that were the case. (Hell, Pong, for that matter...)
Besides, I can assure you that games coders aren't thinking "How can I waste a few more cycles and a bit more RAM?". They're thinking of all the cool things that they wanted to do in their last game, but couldn't because the target hardware wasn't up to it, and how best to optimise it so that it runs acceptably on this game's target hardware.
Now, if you were to argue that an increasing number of games companies appear to be using flashy graphics and sound as a substitute for good, old fashioned gameplay, you might be on to something, but that's a discussion for another thread.
Cheers,
Tim
Re:Oh for the days of good code (Score:2)
I just disagree with our current "we have the resources to waste!
I disagree that the programs designed to push newer cards are a "waste" of resources, but then again I do enjoy gaming... If you are happy with the graphics of older games, you certainly don't have to upgrade. You have to face it, though: if you plan on playing the latest games on your box, you have to upgrade occasionally. It's the same in the console market; every once in a while you have to upgrade if you want to play the new stuff. If you're happy with the old stuff, then stick with it... and enjoy that extra money you save on something else.
Re:Oh for the days of good code (Score:2, Insightful)
OK, then just show me the OS plus GUI, or the app, that has all the modern features but only needs 16K. Oh, that doesn't exist either.
Yeah, applications and OSs have become somewhat bloated, but development time and maintainability come way ahead of trying to save a few clock cycles or a little RAM.
And if you're such a big fan of the highly optimised, unmaintainable code of the past, then why aren't you running that stuff instead of moaning?
Re:Oh for the days of good code (Score:1)
Easy answer: I am.
And... yes... I do use the above, the GUI, the games, the bloated OS et al... but that doesn't mean to say that I like it.
However, the ultimate comeback here would be: "If you don't like it, do something about it!". My reply to this unasked question: I am. :-) And thanks for asking.
Again, a point is conceded to bloatware: it's very nice, and very useful. I am sincerely looking forward to seeing how hardware such as this will be utilized in the future. I await with bated breath to see the extensive uses that are explored. Meanwhile, I don't need it :)
Re: (Score:1)
Re:slashdot front-page summary (Score:2)
graspee
Re:Holy cow (Score:3, Funny)
Oh wow... you got a Hercules card?
Man, I was thinking of upgrading my text adaptor to one of those but I don't know if my old orange-phosphor CRT could handle it.
Of course what I really lust after is one of those BOB color cards. Man, they've got 640x350 pixels in sixteen (yes, count them... SIXTEEN) glorious colors!
What it must be like to be rich eh?
At least I've got 640K of RAM -- that's got to be more than anyone would ever need -- right Bill?
Re:Holy cow (Score:2)
What's most irritating about that isn't that he misjudged, but rather that he didn't fix his mistake, making us all mess around with EMM386.exe for several years.
Re:Holy cow (Score:1)
Basically, IBM engineers decided that the range from 0000:0000 to A000:0000 was a reasonable amount of RAM. That's 10 blocks of 64K, hence the 640K limit. The rest of the address space was used for memory-mapped I/O.
Further, Bill would have loved not to have had to use EMM386 - that thing is/was a hack. And Windows 95 onwards completely replaces any magic that EMM386 did.
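For the curious, the arithmetic behind that boundary is just real-mode segment:offset addressing; a tiny sketch:

    # Real-mode address arithmetic behind the 640K figure:
    # a linear address is segment * 16 + offset.
    def linear(segment, offset):
        return segment * 16 + offset

    print(hex(linear(0xA000, 0x0000)))      # 0xa0000, where the memory-mapped I/O region begins
    print(linear(0xA000, 0x0000) // 1024)   # 640 -- KB of conventional RAM below it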
Re:Holy cow (Score:5, Interesting)
Funny, but just FYI, I do have a working Hercules and I'll tell you something: it displays 80x25 text mode as fast as your GeeForce78 5000000 (or whatever is c00l today), i.e. much faster than I can read. I use it in my web/mail/ftp/dns server with a 500MHz AMD K6-2 and 256MB of RAM. The advantage is that the 14" monitor's depth is somewhere between the smallest new 14" CRT I've seen and a 14" LCD. Works great. Very low power consumption (the card and the monitor). Total cost: $0. Great for servers where you need a display but you don't want to waste a lot of space and power.
Ignore the NASCAR Racing 2002 observations. (Score:2, Insightful)
Re:I'm shocked...don't you know what framerate rea (Score:4, Informative)
Therefore, anything beyond 24 fps is USELESS! Basic biology, folks! And still we get these idiots going "hey it runs at 30 fps!". You can't see that! Your brain cannot cope with more than 24 still pictures per second before it "runs them together" to make moving images.
Repeat after me:
24 frames per second is the minimum required for fusion.
48 frames per second is the minimum required for lack of flicker. (Movies show at 48fps - didn't you know that? Each frame is shown twice, with a gap in between).
Anything higher than that, up to about 100fps, is better. Above 100fps, qualitative judgement of smoothness is pretty much impossible.
Don't believe me? Look at your 60fps (or 50fps) television set out of the corner of your eye - you'll see flicker.
Or, alternatively, if you're american, go to Europe and watch TV there. The flicker will drive you to DISTRACTION (NTSC = 60fps; PAL = 50fps). It's PLAINLY visible.
So in other words, a little knowledge is a dangerous thing. Don't assume that just because you heard somewhere that 24fps is the slowest speed at which images join together to make a moving image, that you can't see any difference between that and higher frame rates.
Simon
Re:Italians do it better (Score:2)
... only because your TV set shows each frame 4 times.
Si
Re:Italians do it better (Score:2)
It displays it twice.
PAL is at 50 Hz.
Re:Wasted Money? (Score:1)